CN119728047A - A data transmission method, device, communication equipment and storage medium - Google Patents

A data transmission method, device, communication equipment and storage medium

Info

Publication number
CN119728047A
Authority
CN
China
Prior art keywords
type
network
data packet
pdu
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311280326.XA
Other languages
Chinese (zh)
Inventor
王凯悦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Mobile Communications Group Co Ltd
Research Institute of China Mobile Communication Co Ltd
Original Assignee
China Mobile Communications Group Co Ltd
Research Institute of China Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Mobile Communications Group Co Ltd, Research Institute of China Mobile Communication Co Ltd
Priority to CN202311280326.XA
Publication of CN119728047A
Legal status: Pending (current)

Abstract


The embodiment of the present invention discloses a data transmission method, apparatus, communication device and storage medium. The method includes: a first network function receives a data packet carrying neural network data, determines the type of the protocol data unit (PDU) set to which the data packet belongs, and adds first indication information to the data packet based on at least the type of the PDU set; the first indication information is at least used to indicate a first quality of service (QoS) processing strategy corresponding to the type of the PDU set; different types of PDU sets correspond to different QoS processing strategies; and sends a data packet with the first indication information added to an access network device.

Description

Data transmission method, device, communication equipment and storage medium
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a data transmission method, a data transmission device, a communication device, and a storage medium.
Background
The data volume of a neural network model is very large, usually more than a hundred megabytes. To ensure the transmission efficiency and transmission quality of the data packets of a neural network model, a hierarchical transmission guarantee strategy needs to be formulated. At present, there is no hierarchical transmission guarantee strategy for transmitting different types of artificial intelligence (AI, Artificial Intelligence) models over a communication network.
Disclosure of Invention
In order to solve the existing technical problems, the embodiment of the invention provides a data transmission method, a device, communication equipment and a storage medium.
In order to achieve the above object, the technical solution of the embodiment of the present invention is as follows:
In a first aspect, an embodiment of the present invention provides a data transmission method, where the method is applied to a first network function, and the method includes:
Receiving a data packet carrying neural network data, determining the type of a protocol data unit (PDU, Protocol Data Unit) set to which the data packet belongs, and adding first indication information into the data packet at least based on the type of the PDU set, wherein the first indication information is at least used for indicating a first quality of service (QoS, Quality of Service) processing strategy corresponding to the type of the PDU set;
And sending the data packet added with the first indication information to access network equipment.
In the above scheme, one PDU set is used for transmitting data of one type of streaming neural network, and/or one or more PDUs in one PDU set are used for transmitting data of a sub-network in the streaming neural network.
In the above scheme, the PDU set is of a frequency domain type, a time type or a space type, and/or the streaming neural network is of a frequency domain type, a time type or a space type.
In the above scheme, the widths of the plurality of sub-networks included in the streaming neural network are sequentially increased;
PDUs corresponding to sub-networks of different width ranges in the same streaming neural network belonging to the frequency domain type correspond to different QoS processing strategies.
In the above scheme, adding the first indication information into the data packet at least based on the type of the PDU set comprises adding the first indication information into the data packet based on the type of the PDU set and the width of the sub-network corresponding to the data packet.
In the above solution, when the type of the PDU set is a frequency domain type, the first indication information is used to indicate a PDU set of the frequency domain type and a first QoS processing policy corresponding to a first width range of a sub-network corresponding to the data packet.
In the above scheme, the data packet includes a first identifier, where the first identifier indicates a type of a PDU set to which the data packet belongs;
The determining the type of the PDU set to which the data packet belongs comprises determining the type of the PDU set to which the data packet belongs based on the first identification.
In the above scheme, the data packet includes a second identifier, where the second identifier represents a first width range where a width of a sub-network corresponding to the data packet is located.
In the above solution, before receiving the data packet carrying the neural network data, the method further includes:
And receiving first information sent by the second network function, wherein the first information comprises configuration information related to QoS processing strategies.
In a second aspect, an embodiment of the present invention further provides a data transmission method, where the method is applied to an access network device, and the method includes:
receiving a data packet carrying neural network data sent by a first network function, wherein the data packet carries first indication information which is at least used for indicating a first QoS processing strategy corresponding to the type of a PDU set to which the data packet belongs;
and processing the data packet carrying the same first indication information according to the first QoS processing strategy.
In the above scheme, one PDU set is used for transmitting data of one type of streaming neural network, and/or one or more PDUs in one PDU set are used for transmitting data of a sub-network in the streaming neural network.
In the above scheme, the PDU set is of a frequency domain type, a time type or a space type, and/or the streaming neural network is of a frequency domain type, a time type or a space type.
In the above scheme, the widths of the plurality of sub-networks included in the streaming neural network are sequentially increased;
PDUs corresponding to sub-networks of different width ranges in the same streaming neural network belonging to the frequency domain type correspond to different QoS processing strategies.
In the above solution, when the type of the PDU set is a frequency domain type, the first indication information is used to indicate a PDU set of the frequency domain type and a first QoS processing policy corresponding to a first width range of a sub-network corresponding to the data packet.
In the above solution, when the type of the PDU set is a frequency domain type, the method further includes:
Preferentially guaranteeing the transmission of the data packet corresponding to the first PDU in the PDU set, and/or,
And detecting that the transmission of data packets corresponding to other PDUs except the first PDU in the PDU set is unsuccessful, and not processing the data packets.
In the above solution, when the type of the PDU set is a spatial type, the method further includes:
And detecting that the transmission of the data packet corresponding to any PDU in the PDU set is unsuccessful, stopping transmitting the data packet to the terminal, and discarding the data packet which is not transmitted.
In the above solution, when the type of the PDU set is a time type, the method further includes:
Preferentially guaranteeing the transmission of the data packet corresponding to the first PDU in the PDU set, and/or,
And detecting that the transmission of the data packets corresponding to other PDUs except the first PDU in the PDU set is unsuccessful, and retransmitting the data packets corresponding to the other PDUs to the terminal.
In the above scheme, the first QoS treatment policy includes a treatment policy related to at least one parameter selected from bandwidth, delay, packet loss rate threshold, and packet priority.
In the above scheme, the method further comprises receiving second information from a second network function via a third network function, wherein the second information comprises configuration information related to the QoS processing policy.
In a third aspect, an embodiment of the present invention further provides a data transmission method, where the method is applied to a first network device, and the method includes:
Determining a first identifier corresponding to the type of a PDU set based on the type of a neural network, and adding the first identifier into a data packet belonging to the PDU set, wherein the data packet is used for bearing the data of the neural network;
and transmitting the data packet added with the first identifier.
In the above scheme, one PDU set is used for transmitting data of one type of streaming neural network, and/or one or more PDUs in one PDU set are used for transmitting data of a sub-network in the streaming neural network.
In the above scheme, the PDU set is of a frequency domain type, a time type or a space type, and/or the streaming neural network is of a frequency domain type, a time type or a space type.
In the above scheme, the widths of the plurality of sub-networks included in the streaming neural network are sequentially increased;
PDUs corresponding to sub-networks of different width ranges in the same streaming neural network belonging to the frequency domain type correspond to different QoS processing strategies.
In the above scheme, under the condition that the type of the PDU set is the frequency domain type, the method further comprises determining a second identifier based on the width of a sub-network in the neural network, and adding the second identifier into a data packet corresponding to the sub-network, wherein different width ranges correspond to different second identifiers.
In a fourth aspect, an embodiment of the present invention further provides a data transmission method, where the method is applied to a second network function, and the method includes:
determining QoS processing strategies corresponding to PDU sets of different types, wherein the types of the PDU sets are related to the types of the neural network;
And sending the first information to the first network function and/or sending the second information to the access network equipment through the third network function, wherein the first information and the second information comprise configuration information related to QoS processing strategies.
In the above scheme, one PDU set is used for transmitting data of one type of streaming neural network, and/or one or more PDUs in one PDU set are used for transmitting data of a sub-network in the streaming neural network.
In the above scheme, the PDU set is of a frequency domain type, a time type or a space type, and/or the streaming neural network is of a frequency domain type, a time type or a space type.
In the above scheme, the widths of the plurality of sub-networks included in the streaming neural network are sequentially increased;
PDUs corresponding to sub-networks of different width ranges in the same streaming neural network belonging to the frequency domain type correspond to different QoS processing strategies.
In a fifth aspect, the embodiment of the invention further provides a data transmission device, which is applied to a first network function, and comprises a first communication unit and a first processing unit, wherein,
The first communication unit is used for receiving a data packet carrying the data of the neural network;
The first processing unit is used for determining the type of the PDU set to which the data packet belongs, and adding first indication information into the data packet at least based on the type of the PDU set, wherein the first indication information is at least used for indicating a first QoS (quality of service) processing strategy corresponding to the type of the PDU set;
the first communication unit is further configured to send a data packet to which the first indication information is added to an access network device.
In a sixth aspect, the embodiment of the invention further provides a data transmission device, which is applied to access network equipment, and comprises a second communication unit and a second processing unit, wherein,
The second communication unit is configured to receive a data packet carrying neural network data sent by a first network function, where the data packet carries first indication information, where the first indication information is at least used to indicate a first QoS processing policy corresponding to a type of a PDU set to which the data packet belongs;
And the second processing unit is used for processing the data packet carrying the same first indication information according to the first QoS processing strategy.
In a seventh aspect, the embodiment of the invention further provides a data transmission device, which is applied to the first network equipment, and comprises a third processing unit and a third communication unit, wherein,
The third processing unit is used for determining a first identifier corresponding to the type of the PDU set based on the type of the neural network, and adding the first identifier into a data packet belonging to the PDU set, wherein the data packet is used for bearing the data of the neural network;
The third communication unit is configured to send a data packet to which the first identifier is added.
In an eighth aspect, the embodiment of the present invention further provides a data transmission apparatus, where the apparatus is applied to a second network function, and the apparatus includes a fourth processing unit and a fourth communication unit, where,
The fourth processing unit is used for determining QoS processing strategies corresponding to PDU sets of different types, wherein the type of the PDU set is related to the type of the neural network;
The fourth communication unit is configured to send first information to the first network function, and/or send second information to the access network device through the third network function, where both the first information and the second information include configuration information related to the QoS processing policy.
In a ninth aspect, an embodiment of the present invention further provides a computer readable storage medium, on which a computer program is stored, which when executed by a processor implements the steps of the data transmission method according to any one of the above-mentioned aspects of the embodiment of the present invention.
In a tenth aspect, an embodiment of the present invention further provides a communication device, including a memory, a processor, and a computer program stored in the memory and capable of running on the processor, where the processor executes the program to implement the steps of the data transmission method according to any one of the foregoing aspects of the embodiments of the present invention.
In the data transmission method, the device, the communication equipment and the storage medium provided by the embodiments of the invention, a first network function receives a data packet carrying neural network data, determines the type of the PDU set to which the data packet belongs, and adds first indication information into the data packet at least based on the type of the PDU set, wherein the first indication information is at least used for indicating a first quality of service (QoS) processing strategy corresponding to the type of the PDU set, and different types of PDU sets correspond to different QoS processing strategies; the data packet added with the first indication information is then sent to the access network equipment. By adopting the technical scheme of the embodiment of the invention, PDU sets carrying neural network data are classified and the data packets are processed according to the QoS processing strategy corresponding to the type of the PDU set, thereby providing a hierarchical transmission guarantee strategy for the transmission of neural network data.
Drawings
Fig. 1 is a first flowchart of a data transmission method according to an embodiment of the invention;
Figs. 2a to 2c are schematic diagrams of the signals represented by various types of streaming neural fields in a data transmission method according to an embodiment of the present invention;
Fig. 3 is a second flowchart of a data transmission method according to an embodiment of the invention;
Fig. 4 is a third flowchart of a data transmission method according to an embodiment of the present invention;
Fig. 5 is a fourth flowchart of a data transmission method according to an embodiment of the invention;
Fig. 6 is a first schematic diagram of an interaction flow of a data transmission method according to an embodiment of the present invention;
Fig. 7 is a second schematic diagram of an interaction flow of a data transmission method according to an embodiment of the present invention;
Fig. 8 is a first schematic diagram of the composition structure of a data transmission device according to an embodiment of the present invention;
Fig. 9 is a second schematic diagram of the composition structure of a data transmission device according to an embodiment of the present invention;
Fig. 10 is a third schematic diagram of the composition structure of a data transmission device according to an embodiment of the present invention;
Fig. 11 is a fourth schematic diagram of the composition structure of a data transmission device according to an embodiment of the present invention;
fig. 12 is a schematic diagram of a hardware composition structure of a communication device according to an embodiment of the present invention.
Detailed Description
The invention will be described in further detail with reference to the accompanying drawings and specific examples.
The technical scheme of the embodiment of the invention can be applied to various communication systems, such as a global system for mobile communications (GSM, Global System for Mobile Communications) system, a long term evolution (LTE, Long Term Evolution) system or a 5G system. Alternatively, the 5G system or 5G network may also be referred to as a New Radio (NR) system or NR network.
By way of example, the communication system to which the embodiments of the present invention are applied may include a network device and a terminal device (which may also be referred to as a terminal, a communication terminal, etc.), and the network device may be a device that communicates with the terminal device. The network device may provide communication coverage for a certain area and may communicate with terminals located within that area. Alternatively, the network device may be a base station in each communication system, such as an evolved base station (eNB, Evolved Node B) in an LTE system, or a base station (gNB) in a 5G system or an NR system.
It should be understood that a device having a communication function in a network/system according to an embodiment of the present application may be referred to as a communication device. The communication device may include a network device and a terminal with a communication function, which may be specific devices described above and will not be described herein, and the communication device may further include other devices in the communication system, for example, a network controller, a mobility management entity, and other network entities, which are not limited in the embodiment of the present application.
It should be understood that the terms "system" and "network" are used interchangeably herein. The term "and/or" merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean that A exists alone, A and B both exist, or B exists alone. In addition, the character "/" herein generally indicates that the associated objects before and after it are in an "or" relationship.
The terms first, second and the like in the description and in the claims, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented, for example, in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Before describing embodiments of the present invention in detail, a brief description of the streaming neural field is first provided.
A field is a quantity defined over all continuous spatial and/or temporal coordinates; it can map a high-dimensional vector of coordinates to a scalar. A neural field is a field that is fully or partially parameterized by a neural network. Neural fields are often parameterized as multi-layer perceptrons with activation functions, which can effectively regress complex signals and have achieved significant success in various signal representations such as images and 3D shapes. Executable sub-networks of different widths can be separated from a single trained network. A streaming neural field is a single model consisting of executable sub-networks of different widths, which can stream the single network over time and reconstruct signals of different qualities and different portions.
The transmission capability of future 6G networks will be stronger, and the computing and processing capability on the network side will also be stronger. Complex AI computation can be placed in the cloud for processing, and the network parameters of the neural network can then be transmitted to the terminal in a streaming manner through the wireless communication network to reconstruct signals.
However, there is currently no hierarchical transmission guarantee policy for transmitting different kinds of artificial intelligence (AI, Artificial Intelligence) models over a communication network, and no specific quality of service (QoS, Quality of Service) guarantee rule for the transmission of neural field data packets. Therefore, enhancement of the QoS parameters for the transmission of neural field protocol data unit (PDU, Protocol Data Unit) set packets is required.
The embodiment of the invention provides a data transmission method, which is applied to a first network function. Fig. 1 is a flow chart of a data transmission method according to an embodiment of the invention, and as shown in fig. 1, the method includes:
step 101, receiving a data packet carrying neural network data, determining the type of a PDU set to which the data packet belongs, and adding first indication information into the data packet at least based on the type of the PDU set, wherein the first indication information is at least used for indicating a first QoS processing strategy corresponding to the type of the PDU set;
and 102, sending the data packet added with the first indication information to access network equipment.
In this embodiment, the first network function is a network function mainly used for forwarding or routing user plane data. In a 5G network, the first network function may be a user plane function (UPF, User Plane Function); in other networks, including future networks, the first network function may be any other network function used for processing, forwarding or routing user plane data, which is not limited in this embodiment.
In order to facilitate QoS guarantees for one class of PDUs, the concept of a PDU set (PDU set) has been introduced. A PDU set may comprise at least one PDU and carries the payload of a unit of information at the application layer. A trained neural network of one class may correspond to one PDU set, and the data of the neural network is carried through the PDU set. In this embodiment, receiving the data packet carrying the neural network data may also be understood as receiving a PDU set or a PDU carrying the neural network model or the neural network data, and correspondingly determining the type of the PDU set or determining the type of the PDU set to which the PDU belongs.
In this embodiment, the neural network may also correspond to a neural field, i.e. a field that is parameterized in whole or in part by the neural network. In other alternative embodiments, the neural network may also correspond to a streaming neural field or a streaming neural network. The streaming neural field or streaming neural network may separate a single trained network into a plurality of executable networks of different widths, which can reconstruct signals of different qualities or different portions over time. An executable network (or sub-network) can only produce a signal in a particular frequency domain, a particular time, or a particular spatial range. For example, a narrower executable network may represent a low-frequency signal and a wider executable network may represent high-frequency detail; if an index B is used to represent the width of an executable network, a smaller index B indicates a smaller width of the executable network. Optionally, the width of the neural network may be equal to the number of channels (channels).
In this embodiment, the widths of the multiple executable networks included in the streaming neural field or the streaming neural network range from small to large, and each executable network in the streaming neural field or the streaming neural network can only represent a specific part of a signal. The widths of the sub-networks are increased layer by layer during training, and the weights between newly added units and the previously trained network are removed so that the newly added units do not affect the outputs of the existing sub-networks. The signal is thus gradually reconstructed in dimensions such as visual quality (or frequency domain), time and space, and the required signal quality can be provided on demand by selecting the width of the neural network.
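To make the width-selection idea concrete, the following minimal Python sketch shows how a single set of trained weights could be evaluated at different widths B to trade quality against size. The layer sizes, the simple channel-slicing rule and all names are illustrative assumptions of this sketch, not part of the described scheme:

```python
import numpy as np

class StreamableMLP:
    """Toy streaming neural field: hidden layers can be evaluated at a
    reduced width B, using only the first B channels of each layer."""

    def __init__(self, in_dim=2, hidden=64, out_dim=3, layers=3, seed=0):
        rng = np.random.default_rng(seed)
        dims = [in_dim] + [hidden] * layers + [out_dim]
        self.weights = [rng.standard_normal((d_in, d_out)) * 0.1
                        for d_in, d_out in zip(dims[:-1], dims[1:])]

    def forward(self, coords, width):
        """coords: (N, in_dim) continuous coordinates; width: B <= hidden."""
        x = coords
        for i, w in enumerate(self.weights):
            # Slice hidden dimensions to the requested width B; the first
            # and last layers keep their input/output dimensions intact.
            rows = w.shape[0] if i == 0 else min(width, w.shape[0])
            cols = w.shape[1] if i == len(self.weights) - 1 else min(width, w.shape[1])
            x = x[:, :rows] @ w[:rows, :cols]
            if i < len(self.weights) - 1:
                x = np.maximum(x, 0.0)  # ReLU activation
        return x

model = StreamableMLP()
coords = np.random.rand(4, 2)                    # sample spatial coordinates
low_quality = model.forward(coords, width=16)    # narrow sub-network
high_quality = model.forward(coords, width=64)   # full-width network
```

In such a sketch, the output computed at width 16 would correspond to the coarse (e.g. low-frequency) reconstruction, while the full width reuses the same weights to add detail.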
Based on this, in some alternative embodiments of the invention, one set of PDUs is used to transmit data for one type of streaming neural network and/or one or more PDUs in one set of PDUs is used to transmit data for a sub-network in the streaming neural network.
In some alternative embodiments, the PDU set is of the frequency domain type, the time type or the space type, and/or the streaming neural network is of the frequency domain type, the time type or the space type.
In this embodiment, a single trained neural network or streaming neural field may correspond to one PDU set. The neural network or streaming neural field includes multiple executable networks (or sub-networks) with widths ranging from small to large, i.e., multiple PDUs may be included in the PDU set, where one or more PDUs may correspond to one executable network (or sub-network). Because the types of neural networks or streaming neural fields differ (or the network characteristics differ), the transmission requirements may also differ. Therefore, in this embodiment, the neural network or the streaming neural field is divided into a frequency domain type, a time type or a space type, and the PDU set carrying the data of such a network is correspondingly divided into a frequency domain type, a time type or a space type.
In some alternative embodiments, different types of PDU sets may be distinguished by different identities. For example, different Sequence Numbers (SN) may be used for identification, e.g., SN0 for frequency domain type, SN1 for time type, SN2 for space type, etc. Further, the network may set corresponding QoS relationship tables for different types of PDUs by defining or configuring them, as shown in table 1, so that the corresponding QoS policies of the PDUs (or data packets) belonging to the PDU set may be determined.
TABLE 1

Type of PDU set | Frequency domain | Space | Time
Index           | 0                | 1     | 2
QoS             | QoS0             | QoS1  | QoS2
In other alternative embodiments, sub-networks (or executable networks) of different widths in the streaming neural network may also be distinguished by different identifiers, e.g., by sub-SNs, with each sub-network (or executable network) encoded independently. Similar to Table 1, sub-rules for the PDUs contained in each PDU set may be set under the QoS guarantee policy of that PDU set. For example, the first PDU in a PDU set of the frequency domain type has the highest requirement on network transmission, so an index identifier may be added to the corresponding data packet and a separate transmission guarantee rule may be set for it. The sub-SNs in each PDU set are relatively independent: even if PDUs in different PDU sets carry the same sub-SN (for example, 0), they can still be regarded as different types of PDUs according to the different SNs of the PDU sets to which they belong.
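As a minimal illustration of this lookup, the following Python sketch maps the example index values of Table 1 to the example policy labels QoS0 to QoS2. The numeric indices, labels and function name are only the illustrative values used in this description, not a standardized encoding:

```python
# A minimal lookup sketch following Table 1; the index values and the
# QoS0/QoS1/QoS2 labels are example values, not a standardized encoding.
QOS_BY_PDU_SET_INDEX = {
    0: ("frequency_domain", "QoS0"),
    1: ("space", "QoS1"),
    2: ("time", "QoS2"),
}

def qos_policy_for(pdu_set_index: int) -> str:
    """Return the QoS processing policy label configured for a PDU-set type index."""
    _pdu_set_type, policy = QOS_BY_PDU_SET_INDEX[pdu_set_index]
    return policy

assert qos_policy_for(0) == "QoS0"   # frequency-domain PDU set
```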
In some alternative examples, if the type of the streaming neural network (or neural network) is the frequency domain type, i.e. it represents a neural field of the frequency domain, and the PDU set thus corresponds to a neural field or neural network of the frequency domain type, the neural network may reconstruct or recover signals of different qualities over time. The width of a sub-network corresponds to the height of the frequency domain: a narrower sub-network generates a low-frequency signal and a wider sub-network generates a high-frequency signal. The narrower and wider sub-networks are in an inclusion relationship, and the wider sub-network only learns the part that the narrower sub-network did not learn. For example, referring to Fig. 2a, the second row shows the widths of the sub-networks and each sub-image in the first row shows the reconstructed or generated signal: the first sub-network has the smallest width and generates the corresponding low-frequency signal, the second sub-network has the second smallest width and generates the corresponding intermediate-frequency signal, and the third sub-network has the largest width and generates the corresponding high-frequency signal. It should be noted that the intermediate-frequency signal generated by the second sub-network does not include the low-frequency signal, and correspondingly the high-frequency signal generated by the third sub-network does not include the intermediate-frequency signal. It can be seen that the quality is lowest when only the low-frequency signal is available and highest when the intermediate-frequency and high-frequency signals are added. The data corresponding to the first sub-network has the highest importance, because the low-frequency signal can be recovered or reconstructed from it; the data corresponding to the later sub-networks is less important than that of the first sub-network, and if the data corresponding to the first sub-network is lost, the signal cannot be recovered or reconstructed even if the later data is transmitted successfully. Therefore, during transmission, the network needs to preferentially guarantee the transmission of the data packet corresponding to the first PDU in the PDU set, because the first PDU can recover or reconstruct the low-frequency image signal; if the transmission of data packets corresponding to PDUs other than the first PDU in the PDU set is detected to be unsuccessful, no further processing is performed, because the requirement of those PDUs on network transmission accuracy is lower than that of the first PDU, and their loss can be left unhandled.
In some alternative embodiments, if the type of the streaming neural network (or neural network) is the space type, i.e. it represents a neural field of space, and the PDU set thus corresponds to a neural field or neural network of the space type, the neural network may reconstruct or recover different portions of the signal over time, and the width of a sub-network corresponds to a portion of the signal. For example, a narrower sub-network produces a part of an image, a wider sub-network produces a larger part of the image, and so on. The narrower and wider sub-networks are in an inclusion relationship, and the wider sub-network only learns the part that the narrower sub-network did not learn. For example, referring to Fig. 2b, the second row represents the widths of the sub-networks, and the shaded portion of each sub-graph in the first row represents the reconstructed or generated signal. The first sub-network has the smallest width and produces a portion of the signal, e.g. a part of the image; the second sub-network has the second smallest width and produces a larger portion of the signal, e.g. a larger part of the image; and the third sub-network has the largest width and produces the complete signal, e.g. the complete image. It should be noted that the portion generated by the second sub-network does not overlap the portion generated by the first sub-network: the two are combined to obtain a larger part of the signal, and correspondingly the portion generated by the third sub-network is combined with the portion generated by the second sub-network to obtain the complete signal. Therefore, under the space type, the data corresponding to each sub-network is equally important, and the complete signal cannot be restored or reconstructed no matter which sub-network's data is lost. Accordingly, during transmission, each PDU in the PDU set has the same requirement on network transmission, and the loss of any PDU means that the signal cannot be completely reconstructed; that is, if the transmission of the data packet corresponding to any PDU in the PDU set is detected to be unsuccessful, transmission of the data packets to the terminal is stopped, and the data packets that have not been transmitted are discarded.
In some alternative embodiments, if the type of the streaming neural network (or neural network) is the time type, i.e. it represents a neural field of time, and the PDU set thus corresponds to a neural field or neural network of the time type, the neural network may reconstruct or recover signals of different time ranges over time, and sub-networks of different widths correspond to signals of different time ranges. For example, a narrower sub-network generates the first few frames of an image sequence, a wider sub-network generates more frames containing those first frames, and so on. The narrower and wider sub-networks are in an inclusion relationship, and the wider sub-network only learns the part that the narrower sub-network did not learn. For example, referring to Fig. 2c, the second row represents the widths of the sub-networks and each sub-graph in the first row represents the reconstructed or generated signal: the first sub-network has the smallest width and generates one frame of the image, the second sub-network has the second smallest width and generates two frames, and the third sub-network has the largest width and generates three frames. It should be noted that the frames generated by the wider sub-networks build on those of the narrower ones: the two frames generated by the second sub-network include the one frame generated by the first sub-network, and the three frames generated by the third sub-network include the two frames reconstructed by the first and second sub-networks. It can be seen that, under the time type, the data corresponding to the sub-networks has continuity. During transmission, the network needs to preferentially guarantee the transmission of the data packet corresponding to the first PDU in the PDU set, because that data packet represents the initial image signal; if the transmission of data packets corresponding to PDUs other than the first PDU in the PDU set is detected to be unsuccessful, the data packets corresponding to those PDUs are retransmitted to the terminal, because the loss of a data packet corresponding to a subsequent PDU would cause a jump in the image, and the lost PDUs must be retransmitted to guarantee the continuity of the signal.
In some alternative embodiments, the widths of the plurality of sub-networks comprised by the streaming neural network increase in sequence;
PDUs corresponding to sub-networks of different width ranges in the same streaming neural network belonging to the frequency domain type correspond to different QoS processing strategies.
In this embodiment, the neural network or the streaming neural network may include sub-networks (or executable networks) of different widths, and the width B of a sub-network (or executable network) may be set. Because spectral bias means that a network preferentially learns the low-frequency part of a signal, the narrower sub-network (or executable network) after training represents the low-frequency signal and the wider sub-network (or executable network) retains high-frequency detail. Because users on different terminal devices require the original signal to be transmitted at different resolutions or qualities, the width of the neural network can be selected as required, and corresponding QoS guarantee rules can then be configured for sub-networks (or executable networks) of different widths. Thus, PDUs corresponding to sub-networks (or executable networks) of different widths or different width ranges in the same streaming neural network of the frequency domain type correspond to different QoS processing strategies or QoS guarantee strategies. Illustratively, higher QoS guarantees, e.g. a larger bandwidth guarantee and a higher packet loss rate threshold, are set for the data packets of sub-networks (or executable networks) with larger width values. The parameters involved in the QoS guarantee policy may include bandwidth, packet loss rate threshold, etc.
In some optional embodiments of the present invention, the adding the first indication information to the data packet at least based on the type of the PDU set includes adding the first indication information to the data packet based on the type of the PDU set and a width of a sub-network corresponding to the data packet.
In some optional embodiments, in a case that the type of the PDU set is a frequency domain type, the first indication information is used to indicate a first QoS treatment policy corresponding to a first width range of a sub-network corresponding to the data packet and the PDU set of the frequency domain type.
In some alternative embodiments, the first indication information is specifically an identification index of QoS rules of the data packet, and is used to identify or indicate a corresponding QoS policy or rule. The first indication information may be a QoS Flow Identifier (QFI), or an index value of a packet, or an Identifier (such as SN) indicating a type of the PDU set, or a combination of an Identifier indicating a type of the PDU set and a sub-Identifier identifying a width or a width range of a sub-network to which the PDU corresponds.
For example, for a PDU set of the frequency domain type, the sub-SN is determined according to the width or the width range of the sub-network (or executable network) corresponding to each PDU in the PDU set; that is, the first indication information may represent the frequency domain type and the width or width range of the sub-network corresponding to the data packet, or the first indication information may represent the QoS policy corresponding to the frequency domain type and the width or width range of the sub-network corresponding to the data packet. The first indication information may be represented by the SN corresponding to the frequency domain type combined with the sub-SN. Further, the network may determine the QoS policy corresponding to the data packet by defining or configuring a QoS relationship table for the frequency domain type and for different width ranges, as shown in Table 2, where L is the total width of the streaming neural network. The index (or first indication information) in Table 2 consists of two digits: the first digit "0" identifies the frequency domain type, and different values of the second digit represent different width ranges.
TABLE 2

Width range | 0 to L/4 | L/4 to L/2 | Greater than L/2
Index       | 00       | 01         | 02
QoS         | QoS00    | QoS01      | QoS02
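The two-digit index of Table 2 can be sketched as follows; the thresholds L/4 and L/2 follow the table, while the function name and the use of half-open ranges are assumptions of this sketch:

```python
# Sketch of the two-digit index of Table 2 for frequency-domain PDU sets:
# the first digit ("0") identifies the frequency domain type and the second
# digit encodes the width range of the sub-network relative to the total
# width L of the streaming neural network.
QOS_BY_INDEX = {"00": "QoS00", "01": "QoS01", "02": "QoS02"}

def frequency_domain_index(subnet_width: float, total_width: float) -> str:
    if subnet_width <= total_width / 4:
        return "00"   # width in (0, L/4]       -> QoS00
    if subnet_width <= total_width / 2:
        return "01"   # width in (L/4, L/2]     -> QoS01
    return "02"       # width greater than L/2  -> QoS02

assert QOS_BY_INDEX[frequency_domain_index(10, 64)] == "QoS00"
```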
In some optional embodiments, the first network function may add the first indication information in a General Packet Radio Service (GPRS) Tunnelling Protocol (GTP) extension header, and in particular may add the first indication information in a GTP-U (GPRS Tunnelling Protocol for the User plane) extension header. For example, the first indication information (or index value) may be added in the header of the downlink PDU session information (DL PDU Session Information) to indicate the QoS policies corresponding to different PDUs or different types of PDU sets, and sent to the access network device (RAN) through the N3 interface.
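The following sketch only illustrates the idea of prepending a small indication field before forwarding on N3; the byte layout shown is an assumption for illustration and is not the 3GPP-defined DL PDU Session Information or GTP-U extension header format:

```python
import struct
from dataclasses import dataclass

# Illustrative only: this is NOT the standardized extension header layout.
# It merely shows the first network function prepending a small indication
# (QFI plus the PDU-set index) to the user-plane payload it forwards to RAN.
@dataclass
class FirstIndicationSketch:
    qfi: int            # QoS Flow Identifier
    pdu_set_index: int  # "first indication information", e.g. 0, 1 or 2

    def to_bytes(self) -> bytes:
        return struct.pack("!BB", self.qfi & 0x3F, self.pdu_set_index & 0xFF)

def add_first_indication(payload: bytes, qfi: int, pdu_set_index: int) -> bytes:
    """Prepend the sketch indication to a downlink packet's payload."""
    return FirstIndicationSketch(qfi, pdu_set_index).to_bytes() + payload
```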
In some optional embodiments, the data packet includes a second identifier, where the second identifier represents a first width range where a width of a sub-network corresponding to the data packet is located.
In this embodiment, before the data packet is sent from the cloud server (e.g. an application function), the cloud server (e.g. the application function) may divide the sub-networks (or executable networks) by width according to the requirements of the terminal, and add to the data packet, according to the width range, a second identifier indicating the first width range in which the width of the sub-network corresponding to the data packet lies. The first network function may determine, according to the second identifier carried by the data packet, that the width of the sub-network corresponding to the data packet is within the first width range. Further, according to the second identifier and the type of the PDU set to which the data packet belongs (i.e. the first identifier), the QoS policy corresponding to the data packet, or the corresponding index or identifier, can be determined.
In some alternative embodiments, the data packet comprises a first identifier, wherein the first identifier represents the type of the PDU set to which the data packet belongs, and the determining the type of the PDU set to which the data packet belongs comprises determining the type of the PDU set to which the data packet belongs based on the first identifier.
In this embodiment, before a data packet is sent from a cloud server (e.g., an application function), the cloud server (e.g., the application function) may classify a neural network or a streaming neural network, so as to determine a type of a PDU set corresponding to the neural network or the streaming neural network, and for each PDU or data packet in the PDU set, add a first identifier indicating the type of the PDU set, and after the first network function receives the data packet, determine the type of the PDU set to which the data packet belongs according to the first identifier in the data packet.
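A hypothetical sketch of this tagging step on the cloud-server (application function) side is given below; the class name, field names and function signature are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical sketch: every packet of a PDU set is tagged with the first
# identifier (the SN of the PDU-set type) and, for frequency-domain sets,
# with a second identifier (the sub-SN of the width range).
@dataclass
class TaggedPacket:
    first_identifier: int             # SN identifying the PDU-set type
    second_identifier: Optional[int]  # sub-SN for the width range, if any
    payload: bytes                    # neural-network data carried by the PDU

def tag_packets(payloads: List[bytes], pdu_set_sn: int,
                width_range_sub_sns: Optional[List[int]] = None) -> List[TaggedPacket]:
    packets = []
    for i, payload in enumerate(payloads):
        sub_sn = width_range_sub_sns[i] if width_range_sub_sns else None
        packets.append(TaggedPacket(pdu_set_sn, sub_sn, payload))
    return packets
```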
In some alternative embodiments, the method further comprises, prior to receiving the packet carrying the neural network data, receiving first information sent by a second network function, the first information comprising configuration information related to QoS treatment policies.
In this embodiment, the second network function may be a control plane network function for delivering QoS policies or rules. In 5G networks, the second network function may be a session management function (SMF, session Management Function), and in other networks, even future networks, the second network function may be any other network function with QoS policy or rule delivery function, which is not limited in this embodiment.
In this embodiment, the second network function issues or configures the determined QoS treatment policy or QoS guarantee policy to the first network function, so that the first network function can process and/or transmit the data packet according to the issued QoS treatment policy or QoS guarantee policy.
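A deliberately simplified sketch of this configuration step is shown below; the table contents, parameter values and the receive/forward method names are assumptions of the sketch, not defined network interfaces or 3GPP messages:

```python
# Sketch: the second network function builds one configuration table and
# delivers it to the first network function (first information) and, via a
# third network function, to the access network device (second information).
def build_qos_configuration() -> dict:
    return {
        "frequency_domain": {"bandwidth_mbps": 50, "delay_ms": 20, "loss_threshold": 1e-3},
        "space":            {"bandwidth_mbps": 30, "delay_ms": 50, "loss_threshold": 1e-4},
        "time":             {"bandwidth_mbps": 30, "delay_ms": 30, "loss_threshold": 1e-3},
    }

def distribute_configuration(first_nf, third_nf, access_network) -> None:
    config = build_qos_configuration()
    first_nf.receive_first_information(config)                   # e.g. to a UPF-like function
    third_nf.forward_second_information(access_network, config)  # on towards the RAN
```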
The embodiment of the invention also provides a data transmission method which is applied to the access network equipment. Fig. 3 is a second flow chart of a data transmission method according to an embodiment of the invention, and as shown in fig. 3, the method includes:
Step 201, receiving a data packet carrying neural network data sent by a first network function, wherein the data packet carries first indication information, and the first indication information is at least used for indicating a first QoS processing strategy corresponding to the type of a PDU set to which the data packet belongs;
and 202, processing the data packet carrying the same first indication information according to the first QoS processing strategy.
In this embodiment, the access network device may be a device that provides communication coverage for a terminal. Optionally, the access network device is a base station in each communication system, for example an evolved base station (eNB) in an LTE system, and further for example a base station (gNB) in a 5G system or an NR system, etc.
In this embodiment, the access network device receives, through the user plane, a data packet sent by the first network function, where the data packet carries first indication information. In some alternative embodiments, the first indication information is specifically an identification index of the QoS rule of the data packet and is used to identify or indicate the corresponding QoS policy or rule. In this embodiment, the QoS rule or policy indicated by the first indication information is associated with the type of the PDU set to which the data packet belongs, and different types of PDU sets correspond to different QoS processing strategies.
In this embodiment, the neural network may also correspond to a neural field, i.e. a field that is parameterized in whole or in part by the neural network. In other alternative embodiments, the neural network may also correspond to a streaming neural field or a streaming neural network. The streaming neural field or streaming neural network may separate a single trained network into a plurality of executable networks of different widths, which can reconstruct signals of different qualities or different portions over time. An executable network (or sub-network) can only produce a signal in a particular frequency domain, a particular time, or a particular spatial range. For example, a narrower executable network may represent a low-frequency signal and a wider executable network may represent high-frequency detail; if an index B is used to represent the width of an executable network, a smaller index B indicates a smaller width of the executable network. Optionally, the width of the neural network may be equal to the number of channels (channels).
Based on this, in some alternative embodiments of the invention, one set of PDUs is used to transmit data for one type of streaming neural network and/or one or more PDUs in one set of PDUs is used to transmit data for a sub-network in the streaming neural network.
In some alternative embodiments, the PDU set is of the frequency domain type, the time type or the space type, and/or the streaming neural network is of the frequency domain type, the time type or the space type.
In this embodiment, a single trained neural network or streaming neural field may correspond to one PDU set. The neural network or streaming neural field includes multiple executable networks (or sub-networks) with widths ranging from small to large, i.e., multiple PDUs may be included in the PDU set, where one or more PDUs may correspond to one executable network (or sub-network). Because the types of neural networks or streaming neural fields differ (or the network characteristics differ), the transmission requirements may also differ. Therefore, in this embodiment, the neural network or the streaming neural field is divided into a frequency domain type, a time type or a space type, and the PDU set carrying the data of such a network is correspondingly divided into a frequency domain type, a time type or a space type.
In this embodiment, the specific description of the type of PDU set and the type of the neural network (or the streaming neural network) may refer to the specific description in the foregoing embodiment, and will not be repeated here.
In some alternative embodiments, the widths of the plurality of sub-networks comprised by the streaming neural network increase in sequence;
PDUs corresponding to sub-networks of different width ranges in the same streaming neural network belonging to the frequency domain type correspond to different QoS processing strategies.
In this embodiment, the widths of the multiple executable networks included in the streaming neural field or the streaming neural network range from small to large, and each executable network in the streaming neural field or the streaming neural network can only represent a specific part of a signal. The widths of the sub-networks are increased layer by layer during training, and the weights between newly added units and the previously trained network are removed so that the newly added units do not affect the outputs of the existing sub-networks. The signal is thus gradually reconstructed in dimensions such as visual quality (or frequency domain), time and space, and the required signal quality can be provided on demand by selecting the width of the neural network.
In this embodiment, the neural network or the streaming neural network may include sub-networks (or executable networks) of different widths, and the width B of a sub-network (or executable network) may be set. Because spectral bias means that a network preferentially learns the low-frequency part of a signal, the narrower sub-network (or executable network) after training represents the low-frequency signal and the wider sub-network (or executable network) retains high-frequency detail. Because users on different terminal devices require the original signal to be transmitted at different resolutions or qualities, the width of the neural network can be selected as required, and corresponding QoS guarantee rules can then be configured for sub-networks (or executable networks) of different widths. Thus, PDUs corresponding to sub-networks (or executable networks) of different widths or different width ranges in the same streaming neural network of the frequency domain type correspond to different QoS processing strategies or QoS guarantee strategies. Illustratively, higher QoS guarantees, e.g. a larger bandwidth guarantee and a higher packet loss rate threshold, are set for the data packets of sub-networks (or executable networks) with larger width values. The parameters involved in the QoS guarantee policy may include bandwidth, packet loss rate threshold, etc.
In some alternative embodiments, the first QoS treatment policy includes a treatment policy associated with at least one of bandwidth, latency, packet loss rate threshold, packet priority.
In this embodiment, not limited to the first QoS treatment policy, the different QoS treatment policies corresponding to the types of the different PDU sets also include treatment policies related to at least one parameter including bandwidth, delay, packet loss rate threshold, and packet priority.
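As an illustration, such a policy could be represented by a small structure holding the four parameters named above; the structure name and the concrete values below are placeholders, not values from this description:

```python
from dataclasses import dataclass

# Minimal representation of a per-PDU-set-type QoS processing policy.
@dataclass
class QosProcessingPolicy:
    bandwidth_mbps: float
    delay_budget_ms: float
    packet_loss_rate_threshold: float
    packet_priority: int

EXAMPLE_POLICIES = {
    "frequency_domain": QosProcessingPolicy(50.0, 20.0, 1e-3, 1),
    "space":            QosProcessingPolicy(30.0, 50.0, 1e-4, 2),
    "time":             QosProcessingPolicy(30.0, 30.0, 1e-3, 2),
}
```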
In some optional embodiments of the present invention, in a case where the type of the PDU set is a frequency domain type, the first indication information is used to indicate a first QoS treatment policy corresponding to a first width range of a sub-network corresponding to the data packet and the PDU set is of the frequency domain type.
In this embodiment, the first indication information may be a QoS Flow Identifier (QFI), or an index value of a data packet, or an identifier (such as SN) that indicates a type of a PDU set, or a combination of an identifier that indicates a type of a PDU set and a sub-identifier that identifies a width or a width range of a sub-network to which the PDU corresponds.
For example, for a PDU set of the frequency domain type, the sub-SN is determined according to the width or the width range of the sub-network (or executable network) corresponding to each PDU in the PDU set; that is, the first indication information may represent the frequency domain type and the width or width range of the sub-network corresponding to the data packet, or the first indication information may represent the QoS policy corresponding to the frequency domain type and the width or width range of the sub-network corresponding to the data packet. The first indication information may be represented by the SN corresponding to the frequency domain type combined with the sub-SN. Illustratively, the access network device may determine the QoS policy corresponding to the data packet by defining or configuring a QoS relationship table for the frequency domain type and for different widths or width ranges, as shown in Table 2 above.
In some optional embodiments, when the type of the PDU set is a frequency domain type, the method further includes preferentially guaranteeing transmission of a data packet corresponding to a first PDU in the PDU set, and/or detecting that transmission of data packets corresponding to other PDUs except the first PDU in the PDU set is unsuccessful and not processing.
In this embodiment, if the type of the streaming neural network (or neural network) is the frequency domain type, i.e. the PDU set corresponds to a neural field or neural network of the frequency domain type, the neural network may reconstruct or recover signals of different qualities over time. The width of a sub-network corresponds to the height of the frequency domain: a narrower sub-network generates a low-frequency signal, a wider sub-network generates a high-frequency signal, the narrower and wider sub-networks are in an inclusion relationship, and the wider sub-network only learns the part that the narrower sub-network did not learn, as shown in Fig. 2a. The quality is lowest when only the low-frequency signal is available and highest when the intermediate-frequency and high-frequency signals are added. The data corresponding to the first sub-network has the highest importance, because the low-frequency signal can be recovered or reconstructed from it; the data corresponding to the later sub-networks is less important, and if the data corresponding to the first sub-network is lost, the signal cannot be recovered or reconstructed even if the later data is transmitted successfully. Therefore, during transmission, the network needs to preferentially guarantee the transmission of the data packet corresponding to the first PDU in the PDU set, because the first PDU can recover or reconstruct the low-frequency image signal; if the transmission of data packets corresponding to PDUs other than the first PDU in the PDU set is detected to be unsuccessful, no further processing is performed, because the requirement of those PDUs on network transmission accuracy is lower than that of the first PDU, and their loss can be left unhandled.
In other optional embodiments, when the type of the PDU set is the space type, the method further includes, upon detecting that transmission of the data packet corresponding to any PDU in the PDU set is unsuccessful, stopping transmission of the remaining data packets to the terminal and discarding the data packets that have not been transmitted.
In this embodiment, if the type of the streaming neural network (or neural network) is the space type, that is, it represents a spatial neural field and the PDU set corresponds to the space type of the neural field or neural network, the neural network can reconstruct or recover different parts of the signal over time, and the width of a sub-network corresponds to the part of the signal it covers. For example, a narrower sub-network produces a portion of an image, a wider sub-network produces a larger portion, and so on. The narrower and wider sub-networks are in an inclusion relationship, so the wider sub-network only learns the parts the narrower sub-network has not learned, as shown in Fig. 2b. Under the space type, the data corresponding to each sub-network is therefore equally important: no matter which sub-network is lost, the complete signal cannot be restored or reconstructed. Accordingly, every PDU in the PDU set places the same requirement on network transmission, and the loss of any PDU renders the signal unrecoverable; when unsuccessful transmission of the data packet corresponding to any PDU in the PDU set is detected, transmission of the remaining data packets to the terminal is stopped and the untransmitted data packets are discarded.
In some optional embodiments, when the type of the PDU set is the time type, the method further includes preferentially guaranteeing transmission of the data packet corresponding to the first PDU in the PDU set, and/or, upon detecting that transmission of data packets corresponding to PDUs other than the first PDU in the PDU set is unsuccessful, retransmitting the data packets corresponding to those PDUs to the terminal.
In this embodiment, if the type of the streaming neural network (or neural network) is the time type, that is, it represents a temporal neural field and the PDU set corresponds to the time type of the neural field or neural network, the neural network can reconstruct or recover signals in different time domains over time, with sub-networks of different widths corresponding to signals in different time domains. For example, a narrower sub-network generates the first few frames of an image sequence, a wider sub-network generates more frames that include those first frames, and so on. The narrower and wider sub-networks are in an inclusion relationship, so the wider sub-network only learns the parts the narrower sub-network has not learned, as shown in Fig. 2c. Under the time type, the data corresponding to the sub-networks is therefore continuous in time. During transmission, the network needs to preferentially guarantee the transmission of the data packet corresponding to the first PDU in the PDU set, because it represents the initial image signal; if transmission of data packets corresponding to PDUs other than the first PDU is detected to be unsuccessful, those data packets are retransmitted to the terminal, since losing a data packet of a subsequent PDU would cause the image to jump, and the lost PDUs must be retransmitted to keep the signal continuous.
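A minimal sketch of the three per-type loss-handling behaviours described above is given below, assuming a scheduler that is told the PDU set type and the index of a PDU whose transmission failed. The function name, type labels and return values are hypothetical and only illustrate the decision logic.

```python
def handle_failed_pdu(pdu_set_type: str, pdu_index: int, remaining_pdus: list) -> str:
    """Decide what to do when the data packet of PDU `pdu_index` was not delivered."""
    if pdu_set_type == "FREQ":
        # Only the first PDU (low-frequency signal) must be protected;
        # losing a later PDU merely lowers quality, so nothing further is done.
        return "retransmit" if pdu_index == 0 else "ignore"
    if pdu_set_type == "SPACE":
        # Every part of the signal is equally important: once any PDU is lost
        # the set cannot be reconstructed, so stop sending and drop the rest.
        remaining_pdus.clear()
        return "drop_set"
    if pdu_set_type == "TIME":
        # Later PDUs carry later frames; a gap causes a visible jump,
        # so the lost PDU is retransmitted to keep the signal continuous.
        return "retransmit"
    raise ValueError(f"unknown PDU set type: {pdu_set_type}")
```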
In some alternative embodiments of the invention, the method further comprises receiving, via the third network function, second information from the second network function, the second information comprising configuration information related to QoS treatment policies.
In this embodiment, the second network function may be a control plane network function for delivering QoS policies or rules. In 5G networks, the second network function may be a Session Management Function (SMF); in other networks, including future networks, it may be any other network function capable of delivering QoS policies or rules, which is not limited in this embodiment. In a 5G network, the third network function may specifically be an Access and Mobility Management Function (AMF).
In this embodiment, the second network function issues or configures the determined QoS treatment policy or QoS guarantee policy to the access network device, so that the access network device can process and/or transmit the data packet according to the issued QoS treatment policy or QoS guarantee policy.
The embodiment of the invention also provides a data transmission method, which is applied to a first network device. Fig. 4 is a schematic flow chart of a data transmission method according to an embodiment of the present invention. As shown in Fig. 4, the method includes:
Step 301, determining a first identifier corresponding to the type of a PDU set based on the type of a neural network, and adding the first identifier to a data packet belonging to the PDU set, wherein the data packet is used for bearing the data of the neural network;
Step 302, sending the data packet to which the first identifier is added.
In this embodiment, the first network device may specifically be an Application Function (AF), an application server, or a cloud server.
In this embodiment, before the data packet is sent from the cloud server (e.g., an application function), the cloud server (e.g., the application function) may determine the type of the neural network or streaming neural network, and thus the type of the PDU set corresponding to it, and add to each PDU or data packet in the PDU set a first identifier indicating the type of the PDU set. After a data packet sent by the first network device reaches the first network function, the first network function can determine the type of the PDU set to which the data packet belongs from the first identifier carried in the data packet.
In some alternative embodiments, one set of PDUs is used to transmit data for one type of streaming neural network and/or one or more PDUs in one set of PDUs is used to transmit data for a sub-network in the streaming neural network.
In some alternative embodiments, the type of the PDU set is a frequency domain type, a time type or a space type, and/or the type of the streaming neural network is a frequency domain type, a time type or a space type.
In some alternative embodiments, the widths of the plurality of sub-networks comprised by the streaming neural network increase in sequence;
PDUs corresponding to sub-networks in different width ranges of the same streaming neural network belonging to the frequency domain type correspond to different QoS processing strategies.
In this embodiment, the specific description of the type of PDU set and the type of the neural network (or the streaming neural network) may refer to the specific description in the foregoing embodiment, and will not be repeated here.
In some optional embodiments of the present invention, in the case that the type of the PDU set is a frequency domain type, the method further includes determining a second identifier based on a width of a sub-network in the neural network, and adding the second identifier to a data packet corresponding to the sub-network, where different width ranges correspond to different second identifiers.
In this embodiment, before the data packet is sent from the cloud server (such as an application function), the cloud server (such as the application function) may divide the sub-networks (or executable networks) by width according to the requirements of the terminal (or user), determine the width range in which the width of each sub-network (or executable network) falls, and, according to that width range, add to the data packet a second identifier indicating the first width range in which the width of the sub-network corresponding to the data packet is located. The first network function can then determine the width range of the sub-network corresponding to the data packet from the second identifier carried by the data packet, and further determine the QoS policy, or the corresponding index or identifier, for the data packet from the second identifier together with the type of the PDU set to which the data packet belongs (i.e., the first identifier).
According to the technical scheme of the embodiment of the invention, the width of the sub-networks (or executable networks) of the neural network can be set according to user requirements, and PDU sets corresponding to sub-networks (or executable networks) whose widths fall in different width ranges are guaranteed differentially according to the corresponding QoS processing strategies, or QoS is guaranteed on the basis of the QoS processing strategy corresponding to the type of the PDU set. For example, when the PDU set is of the frequency domain type, unsuccessful transmission of data packets corresponding to PDUs other than the first PDU in the PDU set requires no further processing; when the PDU set is of the space type, once any PDU in the PDU set fails, transmission is stopped and the untransmitted PDUs are discarded, without retransmitting any data packet that was not transmitted successfully. In this way the transmission efficiency of neural network data packets over the air interface can be guaranteed, the utilization of air interface resources improved, and the user experience ensured.
The embodiment of the invention also provides a data transmission method, which is applied to a second network function. Fig. 5 is a schematic flow chart of a data transmission method according to an embodiment of the present invention. As shown in Fig. 5, the method includes:
Step 401, determining QoS processing strategies corresponding to PDU sets of different types, wherein the types of the PDU sets are related to the types of the neural network;
Step 402, sending first information to a first network function and/or sending second information to an access network device through a third network function, wherein the first information and the second information both comprise configuration information related to QoS processing strategies.
In this embodiment, the second network function may be a control plane network function for delivering QoS policies or rules. In 5G networks, the second network function may be a Session Management Function (SMF), and in other networks or even future networks, the second network function may be any other network function having a QoS policy or rule delivery function, which is not limited in this embodiment. In a 5G network, the third network function may specifically be an access and mobility management function (AMF).
In this embodiment, taking the second network function as the SMF as an example, the SMF may obtain control policies, rules or requirements related to session management from a Policy Control Function (PCF), and may determine, according to the obtained control policies, rules or requirements, the QoS processing strategy, QoS guarantee strategy, and so on corresponding to each type of PDU set, where different types of PDU sets may correspond to different QoS processing strategies. The SMF then delivers the determined QoS processing strategy and the configuration information related to the QoS guarantee strategy to the first network function and the access network device, so that they can process and/or transmit the data packets according to the delivered QoS processing strategy or QoS guarantee strategy. The type of the PDU set is related to the type of the neural network or streaming neural network.
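A hedged sketch of how a control-plane function such as the SMF might derive per-type QoS processing policies from PCF rules and package them as configuration is shown below. The rule structure and field names are assumptions made purely for illustration; the actual PCC rule content is not specified here.

```python
def build_qos_configuration(pcf_rules: dict) -> dict:
    """Map each PDU set type (e.g. 'frequency', 'space', 'time') to a QoS processing policy."""
    config = {}
    for pdu_set_type, rule in pcf_rules.items():
        config[pdu_set_type] = {
            "bandwidth_mbps": rule.get("bandwidth_mbps", 10.0),
            "loss_rate_threshold": rule.get("loss_rate_threshold", 1e-4),
            "priority": rule.get("priority", 5),
            # loss-handling behaviour per type, e.g. "ignore", "drop_set", "retransmit"
            "on_loss": rule.get("on_loss", "retransmit"),
        }
    return config

# The resulting object would be carried in the "first information" sent to the
# first network function (UPF) and in the "second information" forwarded to the
# access network device via the AMF.
example_config = build_qos_configuration({
    "frequency": {"priority": 1, "on_loss": "ignore"},
    "space": {"priority": 2, "on_loss": "drop_set"},
    "time": {"priority": 2, "on_loss": "retransmit"},
})
```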
In some alternative embodiments, one set of PDUs is used to transmit data for one type of streaming neural network and/or one or more PDUs in one set of PDUs is used to transmit data for a sub-network in the streaming neural network.
In some alternative embodiments, the type of the PDU set is a frequency domain type, a time type or a space type, and/or the type of the streaming neural network is a frequency domain type, a time type or a space type.
In some alternative embodiments, the widths of the plurality of sub-networks comprised by the streaming neural network increase in sequence;
PDUs corresponding to sub-networks in different width ranges of the same streaming neural network belonging to the frequency domain type correspond to different QoS processing strategies.
In this embodiment, the specific description of the type of PDU set and the type of the neural network (or the streaming neural network) may refer to the specific description in the foregoing embodiment, and will not be repeated here.
By adopting the technical scheme of the embodiment of the invention, on one hand, the PDU sets carrying neural network data are classified, for example by frequency domain, space and time, the data packets are processed according to the QoS processing strategy corresponding to the type of the PDU set, and QoS management of the PDU sets is enhanced, thereby providing a hierarchical transmission guarantee strategy for neural network data transmission, making full use of wireless air interface resources and improving transmission efficiency. On the other hand, the scheme can set the width of the sub-networks of the neural network according to user requirements and differentiate the QoS guarantee of network transmission for the PDUs (or data packets) corresponding to sub-networks of different widths, so that the efficiency of transmitting neural network data over the air interface is guaranteed, the utilization of air interface resources is improved, and the user experience is ensured.
The data transmission method according to the embodiment of the present invention is described in detail below with reference to specific examples. In the following examples, an application scenario of a 5G network is described as an example, that is, a first network function is UPF, a second network function is SMF, a third network function is AMF, a first network device is AF, and an access network device is RAN.
Fig. 6 is a schematic diagram of an interaction flow of a data transmission method according to an embodiment of the present invention, and as shown in fig. 6, the method includes:
Step 500. A terminal (e.g., UE) initiates a procedure for establishing a PDU Session.
Step 501 AF establishes an AF session with the required QoS.
Step 502 PCF initiates PDU session modification.
Here, in the PCC rule generation process, the PCF takes packet processing requirements or policies at the PDU set level into account. Specifically, the PCF may generate a packet processing requirement or policy for each type of PDU set, or for each PDU set, with the PDU set as the granularity; in step 502 it delivers the generated requirements or policies to the SMF, and the SMF sends the corresponding requirements or policies to the UPF and, via the AMF, to the RAN.
Optionally, the PCF may also take into account packet processing requirements or policies for combinations of PDU set and sub-network width. Specifically, the PCF may generate, with the PDU set and the width range of the sub-network as the granularity, a packet processing requirement or policy for each type of PDU set, or each PDU set, and for each sub-network width range; in step 502 it delivers the generated requirements or policies to the SMF, and the SMF sends the corresponding requirements or policies to the UPF and, via the AMF, to the RAN.
In step 503, the UPF performs identification and marking processing of different types of PDU sets according to the N4 rule sent by the SMF (i.e. the processing requirement or policy issued to the UPF in the process of step 502), and adds the first indication information into the data packet.
Here, the first indication information is specifically an identification index of a QoS rule of the data packet, and is used for identifying or indicating a corresponding QoS policy or rule. The first indication information may be a QoS Flow Identifier (QFI), or an index value of a data packet, or an identifier (such as SN) indicating a type of the PDU set, or a combination of an identifier indicating a type of the PDU set and a sub-identifier identifying a width or a width range of a sub-network to which the PDU corresponds, or the like.
For example, the UPF may add the first indication information in the GTP packet header, specifically in the GTP-U extension header. For instance, the first indication information (or index value) may be added in the header of the downlink PDU session information (DL PDU Session Information) to indicate the QoS policies corresponding to different PDUs or different types of PDU sets, and the packet is then sent to the RAN through the N3 interface.
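The marking step can be pictured with the simplified sketch below: the UPF attaches a QFI-like index to each downlink packet before sending it to the RAN over N3. The byte layout here is a toy stand-in and not the actual GTP-U or DL PDU Session Information encoding defined by 3GPP; it only shows where the first indication information travels.

```python
import struct

def mark_packet(payload: bytes, indication_index: int) -> bytes:
    """Prefix the payload with a 1-byte pseudo 'extension header' carrying the index."""
    if not 0 <= indication_index <= 0x3F:   # QFI is a 6-bit value in 5G, so cap at 63
        raise ValueError("indication index out of range")
    return struct.pack("!B", indication_index) + payload

# Example: mark a chunk of neural network weights with indication index 9.
marked = mark_packet(b"neural-network-weights-chunk", indication_index=9)
```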
In step 504, the RAN identifies a PDU set according to the first indication information in the data packet, and performs transmission guarantee on PDUs in the PDU set.
For example, a relation table as shown in table 1 or table 2 may be preset or configured in the RAN, a QoS policy corresponding to the data packet is determined according to the relation table and the first indication information carried in the data packet, and transmission is guaranteed according to the determined QoS policy.
Fig. 7 is a second schematic diagram of an interaction flow of a data transmission method according to an embodiment of the present invention, where, as shown in fig. 7, the method includes:
Step 600, the UE initiates a service request and PDU session establishment procedure.
Step 601. After receiving the service request of the UE, the AF or cloud server divides the sub-networks by width according to the requirements of the UE, and adds identifiers to the data packets of the PDU sets belonging to different types.
In other examples, the width range of the sub-network may not be determined, and only the first identifier may be added to the data packet. The first identifier identifies the type of the PDU set, and the second identifier identifies the width range in which the width of the sub-network falls, i.e., different width ranges correspond to different second identifiers. For example, if the identifier includes a first identifier and a second identifier, taking 0 to represent the frequency domain type, 1 the space type and 2 the time type, and taking 0 to represent the range 0-L/4, 1 the range L/4-L/2 and 2 the range greater than L/2, the identifier may consist of two fields, the first being the first identifier and the second being the second identifier: 00 may represent the frequency domain type with a width range of 0-L/4, and 01 may represent the frequency domain type with a width range of L/4-L/2. The AF may add the identifier to the data packet according to these rules, as illustrated by the sketch below.
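The following Python sketch shows one way the two-field identifier from this example could be built, assuming the example conventions above (first field 0/1/2 for frequency domain/space/time, second field 0/1/2 for the width ranges 0-L/4, L/4-L/2 and greater than L/2). The function names are illustrative only.

```python
PDU_SET_TYPE = {"frequency": 0, "space": 1, "time": 2}

def width_range_id(width: float, full_width: float) -> int:
    """Map a sub-network width to the example width-range field (L = full_width)."""
    if width <= full_width / 4:
        return 0
    if width <= full_width / 2:
        return 1
    return 2

def build_identifier(pdu_set_type: str, width: float, full_width: float) -> str:
    """e.g. '00' = frequency domain type, width in 0-L/4; '01' = frequency domain, L/4-L/2."""
    return f"{PDU_SET_TYPE[pdu_set_type]}{width_range_id(width, full_width)}"

assert build_identifier("frequency", width=0.2, full_width=1.0) == "00"
assert build_identifier("frequency", width=0.4, full_width=1.0) == "01"
```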
Here, the sub-networks are classified, that is, the neural network or the streaming neural network including the sub-networks is classified, thereby determining the type of the PDU set corresponding to the neural network or the streaming neural network. And one or more PDUs may be included in one PDU set, and one sub-network may correspond to one or more PDUs.
Step 602. The SMF configures the QoS guarantee policy corresponding to the data packet according to the identifier.
Here, the SMF may configure a corresponding QoS guarantee policy for each identifier according to the identifier rules defined by the AF or cloud server. In one implementation, after determining the identifier rules, the AF or cloud server sends the defined identifiers and their meanings to the SMF, and the SMF configures a corresponding QoS guarantee policy for each identifier. In another implementation, the same identifier rules as those of the AF or cloud server are preconfigured in the SMF, and a corresponding QoS guarantee policy is configured for each identifier.
Step 603 to step 605. The SMF sends QoS configuration information to the UPF, and the SMF sends QoS configuration information to the RAN via the AMF.
Here, the execution order of steps 603 and 604 is not limited to the order shown. Likewise, the execution order of steps 601 and 602 is not limited to that shown in this example; step 602 may be executed before step 601.
The AF or cloud server sends the downlink data packet to the UPF, and the UPF sends the downlink data packet to the RAN.
Referring to the description of example one, after receiving the downlink data packet, the UPF identifies and marks the packet according to the policy or rule delivered by the SMF, adds the first indication information to the packet (for example, adds a QFI), and sends the packet to the RAN through the user plane.
Step 606. The RAN performs QoS transmission guarantee on the data packet, and the mapping of QoS flows to Data Radio Bearers (DRBs) is completed by the Service Data Adaptation Protocol (SDAP) entity. The specific QoS requirements (the QoS configuration information obtained from the SMF) may include the bandwidth for packet transmission, the packet loss rate threshold, the packet priority, and so on.
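A rough sketch of a QoS-flow-to-DRB mapping driven by the configuration received from the SMF is given below. The priority thresholds and the mapping rule are invented for illustration; real DRB configuration is considerably richer than this.

```python
def map_qfi_to_drb(qfi: int, qos_config: dict) -> int:
    """Pick a DRB id based on the priority configured for this QoS flow."""
    priority = qos_config[qfi]["priority"]
    if priority <= 2:
        return 1   # high-priority DRB (e.g. the first PDU of a frequency domain set)
    if priority <= 5:
        return 2   # default DRB
    return 3       # best-effort DRB

# Example configuration for QFI 9, with the parameters named in the text above.
qos_config = {9: {"priority": 1, "loss_rate_threshold": 1e-6, "bandwidth_mbps": 20.0}}
drb_id = map_qfi_to_drb(9, qos_config)
```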
Based on the above embodiments, the embodiments of the present invention further provide a data transmission device, where the device is applied to a first network function. Fig. 8 is a schematic diagram of the composition structure of a data transmission device according to an embodiment of the present invention, and as shown in fig. 8, the device includes a first communication unit 11 and a first processing unit 12, wherein,
The first communication unit 11 is configured to receive a data packet carrying data of a neural network;
The first processing unit 12 is configured to determine a type of a PDU set to which the data packet belongs, and add first indication information to the data packet based at least on the type of the PDU set, where the first indication information is at least used to indicate a first QoS treatment policy corresponding to the type of the PDU set;
the first communication unit 11 is further configured to send a data packet to which the first indication information is added to an access network device.
In some alternative embodiments of the invention, one set of PDUs is used to transmit data for one type of streaming neural network and/or one or more PDUs in one set of PDUs is used to transmit data for a sub-network in the streaming neural network.
In some alternative embodiments of the invention, the type of PDU set is a frequency domain type, a time type, or a space type, and/or,
The type of the streaming neural network is a frequency domain type, a time type or a space type.
In some alternative embodiments of the invention, the widths of the plurality of sub-networks comprised by the streaming neural network increase in sequence;
PDUs corresponding to sub-networks in different width ranges of the same streaming neural network belonging to the frequency domain type correspond to different QoS processing strategies.
In some optional embodiments of the present invention, the first processing unit 12 is configured to add first indication information to the data packet based on a type of the PDU set and a width of a sub-network corresponding to the data packet.
In some optional embodiments of the present invention, in a case where the type of the PDU set is a frequency domain type, the first indication information is used to indicate a first QoS treatment policy corresponding to a first width range of a sub-network corresponding to the data packet and the PDU set is of the frequency domain type.
In some optional embodiments of the present invention, the data packet includes a first identifier, where the first identifier indicates a type of PDU set to which the data packet belongs;
The first processing unit 12 is configured to determine, based on the first identifier, a type of PDU set to which the data packet belongs.
In some optional embodiments of the present invention, the data packet includes a second identifier, where the second identifier represents a first width range where a width of a sub-network corresponding to the data packet is located.
The first communication unit 11 is configured to receive, before receiving a packet carrying data of the neural network, first information sent by the second network function, where the first information includes configuration information related to QoS treatment policy.
In the embodiment of the present invention, the first processing unit 12 in the device may be implemented in practical applications by a Central Processing Unit (CPU), a Digital Signal Processor (DSP), a Micro Controller Unit (MCU) or a Field Programmable Gate Array (FPGA), and the first communication unit 11 in the device may be implemented by a communication module (including a basic communication suite, an operating system, a communication module, a standardized interface, a protocol, etc.) and a transceiver antenna.
The embodiment of the invention also provides a data transmission device which is applied to the access network equipment. Fig. 9 is a schematic diagram of a second component structure of a data transmission device according to an embodiment of the present invention, and as shown in fig. 9, the device includes a second communication unit 21 and a second processing unit 22, wherein,
The second communication unit 21 is configured to receive a data packet carrying neural network data sent by a first network function, where the data packet carries first indication information, and the first indication information is at least used to indicate a first QoS treatment policy corresponding to a type of a PDU set to which the data packet belongs;
the second processing unit 22 is configured to process the data packet carrying the same first indication information according to the first QoS treatment policy.
In some alternative embodiments of the invention, one set of PDUs is used to transport data for one type of streaming neural network, and/or,
One or more PDUs in one set of PDUs are used for transmitting data of a sub-network in the streaming neural network.
In some alternative embodiments of the invention, the type of PDU set is a frequency domain type, a time type, or a space type, and/or,
The type of the streaming neural network is a frequency domain type, a time type or a space type.
In some alternative embodiments of the invention, the widths of the plurality of sub-networks comprised by the streaming neural network increase in sequence;
PDUs corresponding to sub-networks in different width ranges of the same streaming neural network belonging to the frequency domain type correspond to different QoS processing strategies.
In some optional embodiments of the present invention, in a case where the type of the PDU set is a frequency domain type, the first indication information is used to indicate a first QoS treatment policy corresponding to a first width range of a sub-network corresponding to the data packet and the PDU set is of the frequency domain type.
In some optional embodiments of the present invention, when the type of the PDU set is a frequency domain type, the second processing unit 22 is further configured to preferentially guarantee transmission of a data packet corresponding to a first PDU in the PDU set, and/or detect that transmission of data packets corresponding to other PDUs except the first PDU in the PDU set is unsuccessful, and do not perform processing.
In some optional embodiments of the present invention, when the type of the PDU set is a spatial type, the second processing unit 22 is further configured to detect that transmission of a data packet corresponding to any PDU in the PDU set is unsuccessful, stop transmitting the data packet to the terminal, and discard the data packet that is not transmitted.
In some optional embodiments of the present invention, when the type of the PDU set is a time type, the second processing unit 22 is further configured to preferentially ensure transmission of a data packet corresponding to a first PDU in the PDU set, and/or detect that transmission of data packets corresponding to other PDUs except for the first PDU in the PDU set is unsuccessful, and re-send the data packets corresponding to the other PDUs to the terminal through the second communication unit 21.
In some optional embodiments of the invention, the first QoS treatment policy comprises a treatment policy associated with at least one of the following parameters:
Bandwidth, delay, packet loss rate threshold, packet priority.
In some alternative embodiments of the invention, the second communication unit 21 is further configured to receive second information from a second network function via a third network function, where the second information includes configuration information related to QoS treatment policies.
In the embodiment of the invention, the second processing unit 22 in the device can be realized by CPU, DSP, MCU or an FPGA in practical application, and the second communication unit 21 in the device can be realized by a communication module (including a basic communication suite, an operating system, a communication module, a standardized interface, a protocol and the like) and a receiving and transmitting antenna in practical application.
The embodiment of the invention also provides a data transmission device which is applied to the first network equipment. Fig. 10 is a schematic diagram of the structure of a data transmission device according to an embodiment of the present invention, and as shown in fig. 10, the device includes a third processing unit 31 and a third communication unit 32, wherein,
The third processing unit 31 is configured to determine a first identifier corresponding to a type of a PDU set based on a type of a neural network, and add the first identifier to a data packet belonging to the PDU set, where the data packet is used to carry data of the neural network;
The third communication unit 32 is configured to send a data packet to which the first identifier is added.
In some alternative embodiments of the invention, one set of PDUs is used to transmit data for one type of streaming neural network and/or one or more PDUs in one set of PDUs is used to transmit data for a sub-network in the streaming neural network.
In some alternative embodiments of the invention, the type of PDU set is a frequency domain type, a time type, or a space type, and/or,
The type of the streaming neural network is a frequency domain type, a time type or a space type.
In some alternative embodiments of the invention, the widths of the plurality of sub-networks comprised by the streaming neural network increase in sequence;
PDUs corresponding to sub-networks in different width ranges of the same streaming neural network belonging to the frequency domain type correspond to different QoS processing strategies.
In some optional embodiments of the present invention, in the case where the type of the PDU set is a frequency domain type, the third processing unit 31 is further configured to determine a second identifier based on a width of a sub-network in the neural network, and add the second identifier to a data packet corresponding to the sub-network, where different width ranges correspond to different second identifiers.
In the embodiment of the invention, the third processing unit 31 in the device can be realized by CPU, DSP, MCU or an FPGA in practical application, and the third communication unit 32 in the device can be realized by a communication module (including a basic communication suite, an operating system, a communication module, a standardized interface, a protocol, and the like) and a transceiver antenna in practical application.
The embodiment of the invention also provides a data transmission device which is applied to the second network function. Fig. 11 is a schematic diagram of a data transmission device according to an embodiment of the present invention, and as shown in fig. 11, the device includes a fourth processing unit 41 and a fourth communication unit 42, wherein,
The fourth processing unit 41 is configured to determine QoS processing policies corresponding to different types of PDU sets, where the type of PDU set is related to the type of the neural network;
The fourth communication unit 42 is configured to send first information to the first network function and/or send second information to the access network device through the third network function, where both the first information and the second information include configuration information related to QoS treatment policy.
In some alternative embodiments of the invention, one set of PDUs is used to transport data for one type of streaming neural network, and/or,
One or more PDUs in one set of PDUs are used for transmitting data of a sub-network in the streaming neural network.
In some alternative embodiments of the invention, the type of PDU set is a frequency domain type, a time type, or a space type, and/or,
The type of the streaming neural network is a frequency domain type, a time type or a space type.
In some alternative embodiments of the invention, the widths of the plurality of sub-networks comprised by the streaming neural network increase in sequence;
PDUs corresponding to sub-networks in different width ranges of the same streaming neural network belonging to the frequency domain type correspond to different QoS processing strategies.
In the embodiment of the present invention, the fourth processing unit 41 in the device may be implemented by CPU, DSP, MCU or FPGA in practical application, and the fourth communication unit 42 in the device may be implemented by a communication module (including a basic communication suite, an operating system, a communication module, a standardized interface, a protocol, etc.) and a transceiver antenna in practical application.
It should be noted that, in the data transmission device provided in the foregoing embodiment, only the division of each program module is used for illustration, and in practical application, the processing allocation may be performed by different program modules according to needs, that is, the internal structure of the device is divided into different program modules, so as to complete all or part of the processing described above. In addition, the data transmission device and the data transmission method provided in the foregoing embodiments belong to the same concept, and specific implementation processes of the data transmission device and the data transmission method are detailed in the method embodiments and are not repeated herein.
The embodiment of the invention also provides communication equipment, which can be specifically a first network function, an access network equipment, a second network function or a first network equipment. Fig. 12 is a schematic diagram of a hardware composition structure of a communication device according to an embodiment of the present invention, as shown in fig. 12, the communication device includes a memory 52, a processor 51, and a computer program stored in the memory 52 and executable on the processor 51, where the processor 51 implements steps of a data transmission method applied to a first network function, an access network device, a second network function, or a first network device when executing the program.
Optionally, the communication device further comprises at least one network interface 53. Wherein the various components of the communication device are coupled together by a bus system 54. It is understood that the bus system 54 is used to enable connected communications between these components. The bus system 54 includes a power bus, a control bus, and a status signal bus in addition to the data bus. But for clarity of illustration the various buses are labeled as bus system 54 in fig. 12.
It will be appreciated that the memory 52 may be volatile memory or non-volatile memory, and may include both volatile and non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Ferromagnetic Random Access Memory (FRAM), a flash memory, a magnetic surface memory, an optical disk, or a Compact Disc Read-Only Memory (CD-ROM); the magnetic surface memory may be disk memory or tape memory. The volatile memory may be Random Access Memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM), Synchronous Static Random Access Memory (SSRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM), Double Data Rate Synchronous Dynamic Random Access Memory (DDRSDRAM), Enhanced Synchronous Dynamic Random Access Memory (ESDRAM), SyncLink Dynamic Random Access Memory (SLDRAM), and Direct Rambus Random Access Memory (DRRAM). The memory 52 described in the embodiments of the present invention is intended to include, without being limited to, these and any other suitable types of memory.
The method disclosed in the above embodiment of the present invention may be applied to the processor 51 or implemented by the processor 51. The processor 51 may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor 51 or by instructions in the form of software. The processor 51 may be a general purpose processor, DSP, or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like. The processor 51 may implement or perform the methods, steps and logic blocks disclosed in embodiments of the present invention. The general purpose processor may be a microprocessor or any conventional processor or the like. The steps of the method disclosed in the embodiment of the invention can be directly embodied in the hardware of the decoding processor or can be implemented by combining hardware and software modules in the decoding processor. The software modules may be located in a storage medium in a memory 52. The processor 51 reads information in the memory 52 and, in combination with its hardware, performs the steps of the method as described above.
In an exemplary embodiment, the communication device may be implemented by one or more Application Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), FPGAs, general purpose processors, controllers, MCUs, microprocessors, or other electronic elements for performing the aforementioned methods.
In an exemplary embodiment, the present invention also provides a computer readable storage medium, such as a memory 52, comprising a computer program executable by the processor 51 of the communication device to perform the steps of the method described above. The computer readable storage medium may be FRAM, ROM, PROM, EPROM, EEPROM, flash Memory, magnetic surface Memory, optical disk, or CD-ROM, or various devices including one or any combination of the above.
The computer readable storage medium provided by the embodiment of the present invention stores a computer program thereon, which when executed by a processor implements the steps of the data transmission method of the embodiment of the present invention applied to the first network function, the access network device, the second network function, or the first network device.
The methods disclosed in the method embodiments provided by the application can be arbitrarily combined under the condition of no conflict to obtain a new method embodiment.
The features disclosed in the several product embodiments provided by the application can be combined arbitrarily under the condition of no conflict to obtain new product embodiments.
The features disclosed in the embodiments of the method or the apparatus provided by the application can be arbitrarily combined without conflict to obtain new embodiments of the method or the apparatus.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is merely a logical function division, and there may be additional divisions of actual implementation, such as multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the various components shown or discussed may be coupled or directly coupled or communicatively coupled to each other via some interface, whether indirectly coupled or communicatively coupled to devices or units, whether electrically, mechanically, or otherwise.
The units described as separate components may or may not be physically separate, and components displayed as units may or may not be physical units, may be located in one place, may be distributed on a plurality of network units, and may select some or all of the units according to actual needs to achieve the purpose of the embodiment.
In addition, each functional unit in each embodiment of the present invention may be integrated in one processing unit, or each unit may be separately used as a unit, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of hardware plus a form of software functional unit.
It will be appreciated by those of ordinary skill in the art that implementing all or part of the steps of the above method embodiments may be accomplished by hardware associated with program instructions, and that the above program may be stored on a computer readable storage medium which, when executed, performs the steps comprising the above method embodiments, where the above storage medium includes various media that can store program code, such as removable storage devices, ROM, RAM, magnetic or optical disks.
Or the above-described integrated units of the invention may be stored in a computer-readable storage medium if implemented in the form of software functional modules and sold or used as separate products. Based on such understanding, the technical solutions of the embodiments of the present invention may be embodied in essence or a part contributing to the prior art in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the methods described in the embodiments of the present invention. The storage medium includes various media capable of storing program codes such as a removable storage device, a ROM, a RAM, a magnetic disk or an optical disk.
The foregoing is merely illustrative of the present invention, and the present invention is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (24)

CN202311280326.XA | Priority date: 2023-09-28 | Filing date: 2023-09-28 | A data transmission method, device, communication equipment and storage medium | Status: Pending | Publication of CN119728047A (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202311280326.XA (CN119728047A) | 2023-09-28 | 2023-09-28 | A data transmission method, device, communication equipment and storage medium

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202311280326.XA (CN119728047A) | 2023-09-28 | 2023-09-28 | A data transmission method, device, communication equipment and storage medium

Publications (1)

Publication Number | Publication Date
CN119728047A | 2025-03-28

Family

ID=95095764

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202311280326.XA (Pending; published as CN119728047A) | A data transmission method, device, communication equipment and storage medium | 2023-09-28 | 2023-09-28

Country Status (1)

Country | Link
CN (1) | CN119728047A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN108632308A (en) * | 2017-03-17 | 2018-10-09 | 电信科学技术研究院 | Control method, device, SMF, UPF, UE, PCF and AN
CN112152948A (en) * | 2019-06-28 | 2020-12-29 | 华为技术有限公司 | Wireless communication processing method and device
CN112217615A (en) * | 2019-07-09 | 2021-01-12 | 华为技术有限公司 | A method and apparatus for supporting time-sensitive networks
CN112913280A (en) * | 2018-10-19 | 2021-06-04 | 诺基亚通信公司 | Configuring quality of service
US20220361037A1 (en) * | 2021-05-05 | 2022-11-10 | Acer Incorporated | User equipment and wireless communication method for neural network computation
CN115812297A (en) * | 2020-06-29 | 2023-03-17 | Oppo广东移动通信有限公司 | Wireless communication method, terminal equipment and network equipment
CN115967992A (en) * | 2021-10-08 | 2023-04-14 | 华为技术有限公司 | Communication method, device and system

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
