CN119544638B - A content pre-fetching method based on node business traffic profile - Google Patents

A content pre-fetching method based on node business traffic profile

Info

Publication number
CN119544638B
CN119544638B (application CN202411635718.8A)
Authority
CN
China
Prior art keywords
node
content
demand
service
service content
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202411635718.8A
Other languages
Chinese (zh)
Other versions
CN119544638A (en)
Inventor
贾锋
车嵘
张晓波
郭乐勐
王因传
孔磊
陶波
段保松
王连清
杨林娜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National University of Defense Technology
Priority to CN202411635718.8A
Publication of CN119544638A
Application granted
Publication of CN119544638B
Legal status: Active
Anticipated expiration

Abstract

(Translated from Chinese)


The present application relates to the field of content delivery networks. To address the low prefetching efficiency of existing prefetching techniques, it discloses a content prefetching method based on node service traffic profiles, together with a computer device, a computer-readable storage medium, and a computer program product. The method comprises: constructing a node service traffic profile; building, on that profile, a dynamic demand model of node service content that predicts the node's dynamic demand vector for service content at any moment in the next access period; generating a node service content prefetch list from the dynamic demand vectors at each moment of the next access period; and prefetching content according to that list. The method improves both the accuracy of demand prediction for node service content and the efficiency of content prefetching.

Description

Content prefetching method based on node service traffic profile
Technical Field
The present application relates to the technical field of content delivery networks, and more particularly to a content prefetching method based on node service traffic profiles, as well as a corresponding computer device, computer-readable storage medium, and computer program product.
Background
To address problems such as high concurrency pressure at central nodes, low forwarding efficiency at backbone nodes, and low transmission rates at edge nodes, the prevailing approach is to use a content delivery network (CDN) to cache service content at cache nodes (also called CDN nodes) close to users, so that node users can obtain content from nearby. Content prefetching, introduced on top of CDN caching, further relieves node traffic pressure to some extent. However, because node service demand changes dynamically and node service traffic is uncertain, cache prefetching often performs poorly, making it difficult to fundamentally reduce node traffic pressure and guarantee the service experience of node users.
Existing caching technology (i.e., CDN caching) mainly responds by caching service content on caching devices (cache nodes) close to the nodes and redirecting service requests to the nearest cache node, thereby shortening access time and relieving node access pressure. Existing prefetching techniques typically fetch the content of specific URLs in advance, either manually or on a schedule, based on the interest types of users or the popularity of cached content, in order to satisfy the specific access needs of node users.
However, these caching and prefetching techniques still have the following shortcomings. Existing caching cannot absorb the network pressure caused by new user requests or access to new service content. Existing prefetching usually considers only the interest types of node users or the popularity of cached content; it ignores the relationship between network topology and resource distribution, as well as network bandwidth and service traffic conditions, and amounts to a manually predicted guess at future access demand. As a result it suffers from low prefetching efficiency, poorly chosen prefetching periods, and poor matching between prefetched and requested content.
Disclosure of Invention
To solve the above problems, the present invention provides a content prefetching method based on node service traffic profiles, together with a computer device, a computer-readable storage medium, and a computer program product, which improve the accuracy of demand prediction for node service content and the efficiency of content prefetching.
To achieve this, according to a first aspect of the present invention, there is provided a content prefetching method based on a node service traffic profile, the method comprising:
constructing a node service traffic profile;
establishing, based on the node service traffic profile, a dynamic demand model of node service content, the model being used to predict the node's dynamic demand vector for service content at any moment in the next access period;
generating a node service content prefetch list according to the node's dynamic demand vectors for service content at each moment in the next access period; and
performing content prefetching according to the node service content prefetch list.
Further, constructing the node service traffic profile comprises: collecting node network traffic and extracting node service information, the node service information including the flow's source IP address, destination IP address, source port number, destination port number, and transport-layer protocol type; computing access statistics for node services from this information to obtain the service content of greatest interest to node users and the node's service patterns; and constructing the node service traffic profile from that content and those patterns.
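The extraction-and-statistics step can be sketched as follows. This is a minimal illustration only: the record field names (`content_id`, `hour`, `bytes`) and the shape of the returned profile are assumptions, since the patent specifies only which five-tuple fields are extracted and that access volumes are tallied.

```python
from collections import Counter, defaultdict

def build_traffic_profile(flow_records, top_n=10):
    """Build a node service traffic profile from captured flow records.

    Each record is assumed to carry the extracted flow fields plus an
    hour of day, a byte count, and a content identifier; the field
    names used here are illustrative, not taken from the patent.
    """
    access_counts = Counter()          # accesses per service content
    hourly_volume = defaultdict(int)   # traffic volume per hour of day
    for rec in flow_records:
        access_counts[rec["content_id"]] += 1
        hourly_volume[rec["hour"]] += rec.get("bytes", 0)
    # the contents of greatest interest to node users
    top_contents = [cid for cid, _ in access_counts.most_common(top_n)]
    return {"top_contents": top_contents,
            "access_counts": dict(access_counts),
            "hourly_volume": dict(hourly_volume)}
```

The hourly volume map is what later lets the cache server locate traffic valleys for scheduling prefetches.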
Further, establishing the dynamic demand model of node service content based on the node service traffic profile comprises: determining the node's degree of demand for service content from the service traffic distribution and durations recorded in the profile; calculating the weight of each historical access period with a forgetting curve; describing incrementally how the node's demand for service content changes with its service traffic; and building the dynamic demand model from the historical-period weights and that change process.
Further, determining the node's degree of demand for service content from the traffic distribution and durations in the profile works as follows: if a certain service flow no longer appears for a long time after it occurred, or appears only intermittently with long gaps, the node's demand for that content is below the demand threshold, or is falling and falling quickly; if a certain service flow of the node lasts longer than a set duration threshold, the node's demand for that content is above the demand threshold.
Further, generating the node service content prefetch list from the dynamic demand vectors at each moment of the next access period comprises: defining the priority of each service content according to those vectors, and adding the service contents to an initial prefetch list in priority order to produce the node service content prefetch list.
Further, adding each service content to the initial prefetch list by priority comprises: taking the maximum of the node's demand degrees over the service contents as the highest priority and adding the corresponding content to the initial prefetch list; when the number of entries for a service content in the initial prefetch list reaches its upper limit, setting that content's demand degree in the dynamic demand vector to 0 and recomputing priorities over the remaining contents; and repeating this process until the demand degree of every service content in the vector is 0 or the initial prefetch list reaches its maximum length, yielding the node service content prefetch list.
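The greedy list-building loop described here can be sketched as below, assuming for simplicity that each content occupies at most one list entry (the patent allows a per-content upper limit); function and variable names are illustrative.

```python
def generate_prefetch_list(demand_vector, max_list_len):
    """Build the node service content prefetch list by repeatedly taking
    the content with the highest remaining demand degree (its priority),
    then zeroing that entry so the rest are re-ranked.

    `demand_vector` maps content id -> predicted demand degree.
    """
    demand = dict(demand_vector)        # work on a copy
    prefetch_list = []
    while len(prefetch_list) < max_list_len:
        if not demand:
            break
        cid = max(demand, key=demand.get)
        if demand[cid] <= 0:            # every remaining demand is zero
            break
        prefetch_list.append(cid)
        demand[cid] = 0                 # added: drop it from the ranking
    return prefetch_list
```

Zeroing the selected entry before re-ranking is what realizes the patent's rule of changing the priority calculation as contents are added.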
Further, performing content prefetching according to the prefetch list comprises: defining each content's demand value as its predicted access frequency in the next period. If the node's storage space is sufficient, contents are prefetched in list order. If storage is insufficient, a cached content is deleted and the pending content is prefetched from the list only when the cached content's demand value is smaller than that of the pending content; otherwise the pending content is not prefetched. In addition, when a content to be prefetched exists on several nodes, the source is chosen by jointly considering each node's service request volume for that content, its storage capacity, and its traffic conditions, and the prefetch is performed from the node with the largest prefetch coefficient.
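A minimal sketch of this eviction-guarded prefetch decision, assuming unit-size contents and a cache represented as a dict from content id to demand value (both simplifications not stated in the patent):

```python
def prefetch(cache, capacity, prefetch_list, demand_value):
    """Execute prefetching against a bounded cache.

    `cache` maps cached content -> its demand value; `demand_value`
    maps content to prefetch -> its predicted access frequency for the
    next period. When space runs out, a cached item is evicted only if
    its demand value is below that of the incoming content.
    """
    for cid in prefetch_list:
        if cid in cache:
            continue                            # already cached
        if len(cache) < capacity:
            cache[cid] = demand_value[cid]      # free space: just fetch
            continue
        victim = min(cache, key=cache.get)      # least-demanded cached item
        if cache[victim] < demand_value[cid]:
            del cache[victim]                   # evict and fetch
            cache[cid] = demand_value[cid]
        # otherwise the pending content is skipped, per the method
    return cache
```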
According to a second aspect of the present invention there is also provided a computer device comprising a memory, a processor and a computer program stored on the memory, the processor executing the computer program to carry out the steps of any of the methods described above.
According to a third aspect of the present invention there is also provided a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of any of the methods described above.
According to a fourth aspect of the present invention there is also provided a computer program product comprising a computer program which, when executed by a processor, performs the steps of any of the methods described above.
In general, compared with the prior art, the technical solutions conceived by the present invention achieve the following beneficial effects:
(1) The content prefetching method based on node service traffic profiles constructs a node service traffic profile and, on top of it, a dynamic demand model of node service content. The model predicts the node's dynamic demand vector for service content at any moment of the next access period; a prefetch list is generated from the vectors at each moment, and content is prefetched accordingly, so node service content is prefetched automatically. Compared with existing techniques, which mostly prefetch specific URLs manually or on a timer, automatic prefetching from the list avoids wasting human effort and makes the process more efficient, thereby improving content prefetching efficiency.
(2) By collecting node network traffic and statistically analyzing the service information the node carries, the method constructs an accurate node service traffic profile.
(3) The method maps the profile to the node's degree of demand for service content, weights historical access periods with a forgetting curve, and describes incrementally how that demand changes with node service traffic, thereby building the dynamic demand model. Unlike existing prediction and recommendation systems, which weigh all historical access periods equally and lack a quantitative treatment of timeliness, the model's demand prediction evolves with time, giving higher prediction accuracy for node service content.
(4) By combining the demand prediction results with the node's traffic valleys, the method prefetches service content by priority during low-traffic periods, so content arrives before users actually request it. This shifts peak traffic, flattening peaks and filling valleys, saves distribution cost, reduces service latency, and avoids the quality degradation caused by dynamic network changes. Existing prefetching techniques often ignore node bandwidth and traffic load; edge-node networks in particular fluctuate heavily and congest easily, seriously hurting network quality and user experience. The present method maximizes both network resource utilization and service content prefetching efficiency.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flow diagram of a content prefetching method based on a node service traffic profile according to an embodiment of the present application;
FIG. 2 is a flowchart illustrating steps for creating a dynamic demand model of node service content according to an embodiment of the present application;
fig. 3 is a schematic diagram of an internal structure of a computer device according to an embodiment of the present application.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention. In addition, the technical features of the embodiments of the present invention described below may be combined with each other as long as they do not collide with each other.
The terms first, second, third and the like in the description and in the claims and in the above drawings, are used for distinguishing between different objects and not necessarily for describing a particular sequential or chronological order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
As shown in fig. 1, a content prefetching method based on a node service traffic profile is provided. It may be performed by a node cache server or by another terminal device communicating with the node over a network. The terminal device may be, without limitation, a personal computer, notebook computer, smartphone, or tablet; the node cache server may be a stand-alone server or a cluster of servers. For illustration, the method is described as applied on the node cache server, and comprises the following steps:
Step 101: construct the node service traffic profile.
The node cache server collects the node's network traffic (i.e., the node network's entire traffic) day by day; extracts node service information covering identifiers such as the flow's source IP address, destination IP address, source port number, destination port number, and transport-layer protocol type; computes access statistics per access period to find the service content of greatest interest to node users; summarizes the node's service patterns; and constructs the node service traffic profile over the operating time. Here, node service traffic is the traffic generated by a specific service of the node.
Step 102: establish a dynamic demand model of node service content based on the node service traffic profile.
The model is used to predict the node's dynamic demand vector for service content at any moment in the next access period.
Illustratively, the node cache server models the dynamic demand for node service content from the profile, mapping the node's degree of demand for each service and its change over time by analyzing factors such as the distribution and duration of service traffic in the profile.
Since the node's demand superimposes the access conditions of several periods, the weight of each historical access period is calculated with a forgetting curve, the change of the node's demand for service content with its traffic is described incrementally, and the node's dynamic demand vector for service content at a given moment of the next access period is predicted.
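The forgetting-curve weighting might be sketched as below. The patent gives no formula, so the exponential (Ebbinghaus-style) decay and the `strength` parameter are illustrative assumptions; the only properties taken from the text are that recent access periods weigh more than older ones and that the per-period demand vectors are superimposed.

```python
import math

def period_weights(num_periods, strength=1.0):
    """Normalized forgetting-curve weights for historical access periods:
    age 0 is the most recent period and gets the largest weight; older
    periods decay exponentially. The decay form is an assumption."""
    raw = [math.exp(-strength * age) for age in range(num_periods)]
    total = sum(raw)
    return [w / total for w in raw]

def predicted_demand(history, strength=1.0):
    """Superimpose per-period demand vectors (newest first) into one
    predicted dynamic demand vector for the next access period."""
    weights = period_weights(len(history), strength)
    contents = {c for vec in history for c in vec}
    return {c: sum(w * vec.get(c, 0.0) for w, vec in zip(weights, history))
            for c in contents}
```

With this weighting, a content accessed heavily last period but not before still scores higher than one accessed only in the distant past, which is the timeliness property the patent claims over equal-weight recommendation systems.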
Step 103: generate the node service content prefetch list from the node's dynamic demand vectors for service content at each moment of the next access period.
The node cache server defines each service content's priority from those vectors and adds the contents to an initial prefetch list in priority order to obtain the final prefetch list. While adding contents, to prevent contents in high demand from being added to overload, the priority calculation is adjusted according to the proportion of contents already added.
Step 104: perform content prefetching according to the node service content prefetch list.
Illustratively, the node cache server defines a content's demand value as its predicted access frequency in the next period, and decides each prefetch by comparing the demand value of cached content with that of the content to be prefetched.
If the node's storage space is sufficient, contents are prefetched in list order. If it is insufficient, a cached content is deleted and the pending content prefetched only when the cached content's demand value is smaller; otherwise the pending content is skipped. In addition, when a content to be prefetched exists on several nodes, the source is chosen by jointly considering each node's service request volume for that content, its storage capacity, and its traffic conditions, and the node with the largest prefetch coefficient is selected. The prefetch coefficient measures how suitable a node is as a prefetch source: the larger the coefficient, the higher the success rate of the prefetch operation.
The prefetch operation itself is scheduled from the node's network traffic conditions: traffic valleys are found from the traffic profile, and the prefetch list is executed when traffic is low or at a valley.
In this content prefetching method, a node service traffic profile is constructed and a dynamic demand model of node service content is built on it; the model predicts the node's dynamic demand vector for service content at any moment of the next access period; a prefetch list is generated from the vectors at each moment; and prefetching is executed from the list. Node service content is thus prefetched automatically, improving content prefetching efficiency.
In one embodiment, as shown in fig. 2, step 102 — establishing the dynamic demand model of node service content based on the node service traffic profile — includes the following steps:
Step 201: determine the node's degree of demand for service content from the service traffic distribution and durations in the profile;
Step 202: calculate the weight of each historical access period with a forgetting curve;
Step 203: describe incrementally how the node's demand for service content changes with its service traffic;
Step 204: build the dynamic demand model of node service content from the historical-period weights and that change process.
In this embodiment, the model's demand prediction for node service content changes dynamically with time, which improves prediction accuracy.
In one embodiment, a content prefetching method based on a node service traffic profile includes the following steps:
(1) Construct the node service traffic profile.
Collect the node's network traffic day by day, compute access statistics for node services per access period, summarize the node's service patterns, and construct the node service traffic profile.
(2) Model the node's dynamic service demand based on the node service traffic profile.
The mapping from the node service traffic profile to the node's service demand level mainly considers the following factors: 1) if a certain service flow of the node no longer appears for a long time, or appears intermittently with long gaps, the demand for that service content is low (below a set demand threshold) or is falling quickly; if a certain service flow persists, or its volume is small but its duration exceeds a set duration threshold, the demand for that content is high (above the demand threshold); 2) the node's demand level is the superposition of the influence of several periods of access; 3) the node's access duration for a service content decreases continuously over time.
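One possible way to encode these qualitative factors numerically is sketched below. The exponential recency decay, the linear persistence bonus, and every parameter value are illustrative assumptions chosen only to satisfy the rules above; the patent does not specify a formula.

```python
def demand_degree(periods_since_seen, duration, duration_threshold=3,
                  base=1.0, decay=0.5):
    """Map a service flow's recency and persistence to a demand degree.

    Flows absent for many periods decay quickly toward zero; flows that
    persist longer than `duration_threshold` score fully even if their
    volume is small. All parameters are illustrative.
    """
    recency_factor = decay ** periods_since_seen       # 1.0 when just seen
    persistence = (1.0 if duration >= duration_threshold
                   else duration / duration_threshold)  # partial credit
    return base * recency_factor * persistence
```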
The weight of each historical access period is calculated with a forgetting curve; the change of the node's demand for service content with traffic is described incrementally; and the node's dynamic demand vector for service content at a given moment of the next access period is predicted. This demand vector reflects the overall demand condition of the node's services.
(3) Generating node service content prefetch lists
1) Define content priority from the node's degree of demand for each service content, i.e., the content with the highest demand degree in the node's demand vector gets the highest priority;
2) Add contents to the prefetch list in priority order, i.e., the content with the highest priority is added first. While adding, to prevent contents in high demand from being added to overload, the priority calculation is adjusted according to the proportion of contents already added: for example, when the number of entries for a service content in the list reaches its upper limit, that content's demand degree in the vector used for priority calculation is set to 0, and priorities are recomputed over the remaining contents;
3) Repeat this process until every demand degree in the vector used for priority calculation is 0 or the prefetch list reaches its maximum length, forming the final node service content prefetch list.
(4) Prefetch node service content
Based on the node's network traffic conditions, the node cache server executes prefetching according to the node service content prefetch list when traffic is low or at a valley. The detailed flow is as follows:
1) Once per period, the node cache server tallies node users' access records to obtain the demand degree of each service content, and computes the node's dynamic demand vector for service content;
2) The node cache server computes content priorities from the dynamic demand vector and derives the node service content prefetch list from them;
3) The node cache server executes prefetching according to the list, with each content's demand value defined as its predicted access frequency in the next period. During prefetching, if the node's storage space is sufficient, contents are fetched in list order; if it is insufficient but a cached content's demand value is smaller than that of the content to be prefetched, the cached content is deleted and the pending content fetched from the list;
4) When executing a prefetch, if the service content to be prefetched exists on several nodes, the node cache server judges comprehensively based on each node's service request volume for that content, its storage capacity, and its traffic conditions, and selects the node with the largest prefetch coefficient (i.e., the optimal one) as the prefetch source. Prefetched content may thus come from several different nodes; the prefetch coefficient measures each node's suitability, and the larger it is, the higher the success rate of the prefetch operation.
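The selection among candidate source nodes might look like the sketch below. The patent only names the three factors it combines; the linear weighting used here to form a prefetch coefficient, and the candidate field names, are assumptions for illustration.

```python
def select_source_node(candidates):
    """Pick the candidate source node with the largest prefetch
    coefficient. Each candidate is assumed to report its service
    request volume for the content, its free storage, and its current
    traffic load; the weights below are illustrative only."""
    def coefficient(node):
        # more requests served and more free storage raise the score;
        # heavier current traffic lowers it
        return (0.4 * node["request_volume"]
                + 0.3 * node["free_storage"]
                - 0.3 * node["traffic_load"])
    return max(candidates, key=coefficient)
```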
In tests, the content prefetching method based on node service traffic profiles provided by this embodiment achieved a demand prediction accuracy for node service content of at least 80%, and by prefetching node service content automatically it avoided manual operation and improved content prefetching efficiency by at least 50%.
The application also provides a computer device whose internal structure may be as shown in fig. 3. The computer device includes a processor, a memory, an input/output interface, a communication interface, a display unit, and an input means. The processor, memory, and input/output interface are connected through a system bus; the communication interface, display unit, and input means connect to the system bus through the input/output interface. The processor provides computing and control capabilities. The memory comprises a non-volatile storage medium, which stores an operating system and a computer program, and an internal memory, which provides the environment for running them. The input/output interface exchanges information between the processor and external devices. The communication interface communicates with external terminals by wire or wirelessly, e.g., via WIFI, a mobile cellular network, or NFC. When executed by the processor, the computer program implements a content prefetching method based on node service traffic profiles. The display unit forms a visual picture and may be a display screen, a projection device, or a virtual-reality imaging device; the input means may be a touch layer covering the display screen, a key, trackball, or touchpad on the housing, or an external keyboard, touchpad, or mouse.
Those skilled in the art will appreciate that the structure shown in fig. 3 is merely a block diagram of some of the structures relevant to the present solution and does not limit the computer device to which the present solution is applied; a particular computer device may include more or fewer components than shown, combine certain components, or arrange the components differently.
As shown in fig. 3, the present application further provides a computer device that includes a memory, a processor, and a computer program stored on the memory, where the processor, when executing the computer program, implements the steps in the above method embodiments.
The application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method embodiments described above. The computer-readable storage medium may include any type of disk (including floppy disks, optical disks, DVDs, CD-ROMs, micro-drives, and magneto-optical disks), ROM, RAM, EPROM, EEPROM, DRAM, VRAM, flash memory devices, magnetic or optical cards, nanosystems (including molecular memory ICs), or any type of medium or device suitable for storing instructions and/or data.
The application also provides a computer program product comprising a computer program which, when executed by a processor, implements the steps of the method embodiments described above.
It should be noted that, for simplicity of description, the foregoing method embodiments are described as a series of acts, but those skilled in the art should understand that the present application is not limited by the order of acts described, as some steps may be performed in other orders or concurrently in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules involved are not necessarily required by the present application.
Each of the foregoing embodiments is described with its own emphasis; for parts of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
The foregoing is merely exemplary embodiments of the present disclosure and is not intended to limit its scope; equivalent changes and modifications made in accordance with the teachings of this disclosure fall within that scope. Other embodiments of the disclosure will be readily apparent to those skilled in the art from consideration of the specification and practice of the disclosure. This application is intended to cover any variations, uses, or adaptations of the disclosure that follow its general principles, including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. The specification and examples are to be considered exemplary only, with the true scope and spirit of the disclosure indicated by the claims.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, as long as a combination of technical features involves no contradiction, it should be considered within the scope of this description.
Those skilled in the art will readily appreciate that the foregoing is merely a preferred embodiment of the invention and is not intended to limit the invention; any modification, equivalent substitution, or improvement made within the spirit and principles of the invention shall be included within the scope of the invention.

Claims (8)

The method comprises: determining the demand degree of a node for service content based on the node service traffic distribution and duration in the node service traffic profile; calculating the weight of each historical access period by using a forgetting curve; describing, in an incremental manner, how the node's demand degree for service content changes with the node service traffic; and establishing a dynamic demand model of the node service content based on the weights of the historical access periods and the change process of the demand degree with the node service traffic, wherein the dynamic demand model of the node service content is used for predicting the node's dynamic demand vector for service content at any moment in the next access period, the demand vector reflecting the overall demand of the node for services;
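The claimed model can be illustrated with a minimal sketch: a demand degree derived from each content item's traffic and duration shares, forgetting-curve weights over historical access periods, and a weighted blend that yields the dynamic demand vector, from which a prefetch list is drawn. This is an illustrative sketch only, not the patented implementation: the exponential (Ebbinghaus-style) forgetting curve, the product-form demand degree, and all names and parameters (`update_demand_vector`, `strength`, `top_n`) are assumptions introduced here.

```python
import math

def forgetting_weight(periods_ago: int, strength: float = 2.0) -> float:
    """Forgetting-curve weight: older access periods get exponentially
    smaller weight (assumed exponential form; `strength` is hypothetical)."""
    return math.exp(-periods_ago / strength)

def demand_degree(traffic_share: float, duration_share: float) -> float:
    """Demand degree of one content item from the node's traffic share and
    duration share for that item (illustrative product form)."""
    return traffic_share * duration_share

def update_demand_vector(history: list[dict[str, tuple[float, float]]],
                         strength: float = 2.0) -> dict[str, float]:
    """Blend per-period demand into one dynamic demand vector.

    `history` maps content id -> (traffic_share, duration_share) for each
    access period, ordered oldest to newest; forgetting-curve weights are
    normalised so that recent periods dominate the weighted average."""
    n = len(history)
    weights = [forgetting_weight(n - 1 - k, strength) for k in range(n)]
    total = sum(weights)
    demand: dict[str, float] = {}
    for w, period in zip(weights, history):
        for cid, (traffic, duration) in period.items():
            demand[cid] = demand.get(cid, 0.0) + (w / total) * demand_degree(traffic, duration)
    return demand

def prefetch_list(demand: dict[str, float], top_n: int) -> list[str]:
    """The top-N contents by predicted demand form the prefetch list."""
    return [cid for cid, _ in sorted(demand.items(), key=lambda kv: -kv[1])[:top_n]]

# Two access periods, oldest first: traffic recently shifted toward "map".
history = [
    {"video": (0.6, 0.5), "map": (0.4, 0.5)},
    {"video": (0.2, 0.3), "map": (0.8, 0.7)},
]
demand = update_demand_vector(history)
print(prefetch_list(demand, top_n=1))  # the recent shift makes "map" rank first
```

Because the weights are normalised, the demand vector stays a weighted average of per-period demand degrees, so the incremental update never inflates demand; the forgetting curve simply shifts the balance toward recent traffic.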
CN202411635718.8A, filed 2024-11-15 (priority date 2024-11-15): A content pre-fetching method based on node business traffic profile. Status: Active. Granted publication: CN119544638B (en).

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202411635718.8A | 2024-11-15 | 2024-11-15 | A content pre-fetching method based on node business traffic profile


Publications (2)

Publication Number | Publication Date
CN119544638A | 2025-02-28
CN119544638B | 2025-09-30

Family

ID=94699201


Citations (2)

* Cited by examiner, † Cited by third party
Publication Number | Priority Date | Publication Date | Assignee | Title
CN118784720A * | 2024-07-25 | 2024-10-15 | Civil Aviation Flight University of China | A pre-fetching adaptive intelligent caching method based on machine learning
CN118870069A * | 2024-07-19 | 2024-10-29 | 天之洁智慧科技(盐城)有限公司 | An AI-based intelligent tuning method for multimedia system integration

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication Number | Priority Date | Publication Date | Assignee | Title
EP2237518A1 * | 2009-03-30 | 2010-10-06 | Mitsubishi Electric Corporation | Pre-pushing of popular content in a network of end-user equipments
WO2012050913A1 * | 2010-09-28 | 2012-04-19 | The Ohio State University | Predictive network system and method
US10063653B2 * | 2014-12-29 | 2018-08-28 | Akamai Technologies, Inc. | Distributed server architecture for supporting a predictive content pre-fetching service for mobile device users
CN107959640B * | 2016-10-14 | 2020-07-07 | Tencent Technology (Shenzhen) Co., Ltd. | Network service scheduling method and device
US20180176325A1 * | 2016-12-15 | 2018-06-21 | Huawei Technologies Co., Ltd. | Data pre-fetching in mobile networks
US10749894B2 * | 2017-02-15 | 2020-08-18 | Cisco Technology, Inc. | Prefetch intrusion detection system
CN109195180A * | 2018-07-20 | 2019-01-11 | Chongqing University of Posts and Telecommunications | A method for reducing content acquisition delay in a mobile content-centric network
CN109857934A * | 2019-01-21 | 2019-06-07 | Guangzhou University | Software module cache prefetching method, apparatus and medium based on user behavior analysis
KR102085838B1 * | 2019-09-27 | 2020-05-26 | SK Telecom Co., Ltd. | Method for providing content by means of preloading and apparatus thereof
CN114124733B * | 2020-08-27 | 2024-05-14 | China Telecom Corporation Limited | Service flow prediction method and device
CN115576973B * | 2022-09-30 | 2023-04-11 | 北京领雾科技有限公司 | Service deployment method, device, computer equipment and readable storage medium
CN117492854A * | 2023-06-28 | 2024-02-02 | Mashang Consumer Finance Co., Ltd. | Resource preloading method and device, electronic equipment and computer readable storage medium
CN117009690A * | 2023-07-03 | 2023-11-07 | 唯科终端技术(深圳)有限公司 | Method and system for preloading content
CN117596133B * | 2024-01-18 | 2024-04-05 | 山东中测信息技术有限公司 | Service portrait and anomaly monitoring system and monitoring method based on multidimensional data

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
CN118870069A (en)*2024-07-192024-10-29天之洁智慧科技(盐城)有限公司 An AI-based intelligent tuning method for multimedia system integration
CN118784720A (en)*2024-07-252024-10-15中国民用航空飞行学院 A pre-fetching adaptive intelligent caching method based on machine learning


Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
