Data caching method and data caching system for network requests

Technical Field
The present invention relates to the field of networks, and more particularly to a data caching method and a data caching system for network requests.
Background Art
With the development of information network technology, various clients for different applications have been developed in succession. To improve client performance, a data cache layer is usually added on the client side, or a data layer is added on the server side. These clients obtain data from the server in advance and save it in memory; when the client sends a data request, the corresponding data can be retrieved from the data cache and returned to the client.
The traditional data caching methods used for the data cache described above mainly include a batch pre-filling data caching method and a real-time filling data caching method. For example, when the batch pre-filling method is used, the data cache is filled in bulk before the client starts requesting data, so that the cached data can satisfy the data requests subsequently sent by the client. When the real-time filling method is used, after the client sends a data request, the server first stores the corresponding data into the data cache according to the client's request, and the client then fetches its requested data from the data cache.
However, both traditional caching methods still have shortcomings. The batch pre-filling method is in fact a kind of blind caching: when the client needs to request a large amount of data, the data cache cannot hold all of the data in memory, and it also occupies a large amount of memory space. With the real-time filling method, the first data request from the client consumes a large amount of response time, causing a delay in the data requested by the client, which makes it difficult to meet the real-time requirements of client data retrieval.
Summary of the Invention
The primary technical problem to be solved by the present invention is to provide, in view of the above prior art, a data caching method for network requests.
A further technical problem to be solved by the present invention is to provide, in view of the above prior art, a data caching system for network requests.
The technical solution adopted by the present invention to solve the above primary technical problem is as follows: a data caching method for network requests, for realizing data communication between a client and a server, characterized by comprising the following steps 1 to 5:
Step 1: a data caching device is added between the client and the server, so that each data request message transmitted on the network is received by the data caching device; each data request message carries a URL identifier characterizing that data request message;
Step 2: the data caching device intercepts the URL identifier of each data request message, and caches each URL identifier together with the reply data corresponding to each URL identifier into the data caching device, thereby establishing a cache database for the cached request messages; the cached data in the cache database includes each data request message, the URL identifier corresponding to each data request message, and the reply data corresponding to each URL identifier;
Step 3: when the client needs to send a data request message to the server, the client sends the data request message to the data caching device, and the data caching device makes the following judgment:
if the URL identifier corresponding to the current data request message sent by the client is found in the cache database, the data caching device sends the cached reply data corresponding to the current URL identifier to the client; otherwise, the data caching device forwards the data request message sent by the client to the server;
Step 4: according to the data request message of the client forwarded by the data caching device, the server sends the reply data corresponding to the URL identifier of that data request message to the data caching device for caching;
Step 5: the data caching device sends the cached reply data transmitted by the server to the client, so that the client receives its requested data.
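Steps 1 to 5 above can be sketched as a minimal caching device keyed by URL identifier; this is an illustrative sketch, not the patented implementation, and the names `DataCachingDevice`, `fetch_from_server` and `handle_request` are assumptions introduced here.

```python
# Minimal sketch of steps 1-5: a caching device sits between client and
# server. The server is modeled as a plain function for illustration.

class DataCachingDevice:
    def __init__(self, fetch_from_server):
        self._fetch = fetch_from_server   # forwards a request to the server
        self._cache = {}                  # cache database: URL id -> reply data

    def handle_request(self, url_id):
        # Step 3: serve from the cache database when the URL id is present.
        if url_id in self._cache:
            return self._cache[url_id]
        # Steps 4-5: otherwise forward to the server, cache the reply,
        # then return it to the client.
        reply = self._fetch(url_id)
        self._cache[url_id] = reply
        return reply


def fake_server(url_id):
    return f"reply for {url_id}"

device = DataCachingDevice(fake_server)
first = device.handle_request("/api/item/1")   # forwarded to the server
second = device.handle_request("/api/item/1")  # answered from the cache
```

The second request never reaches the server, which is the latency saving the method claims over the real-time filling approach.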
As an improvement, in the data caching method for network requests, when the server updates reply data, the server sends the updated reply data and the URL identifier corresponding to the updated reply data to the data caching device; the data caching device updates the reply data and the corresponding URL identifier in its cache database; when the client requests data again, the data caching device sends the updated, latest reply data corresponding to the URL identifier in the cache database to the client, thereby realizing real-time updating of the data in the data caching device.
Further, in the data caching method for network requests, after the reply data in the cache database of the data caching device is updated, the data caching device deletes the expired reply data, so as to save cache space in the data caching device.
Further, in the data caching method for network requests, the cached data in the cache database of the data caching device includes at least one of audio data, video data, image data and text data.
Still further, in the data caching method for network requests, the cache database has cache subspaces for the different types of cached data; the cache subspaces include an audio data cache subspace, a video data cache subspace, an image data cache subspace and a text data cache subspace.
As a further improvement, the data caching method for network requests further includes: assigning different cache priority levels to the audio data, video data, image data and text data respectively; when the data in the cache database needs to be updated, the audio data, video data, image data and text data are updated and cached in order according to the assigned cache priority levels.
As a still further improvement, the data caching method for network requests further includes: the data caching device counts the frequency with which each piece of data in its cache database is accessed by clients, so as to obtain the access frequency of each piece of data; according to the order of access frequencies, the data caching device assigns different update priorities to the pieces of data accordingly, and updates the data with high access frequency first.
The technical solution adopted by the present invention to solve the above further technical problem is as follows: a data caching system for network requests, including a client and a server, characterized in that the data caching system further includes a data caching device, the data caching device being communicatively connected to the client and the server respectively; wherein:
the client is configured to send a data request message to the data caching device, the data request message carrying a URL identifier characterizing that data request message;
the data caching device is configured to receive the data request message sent by the client, judge the data request message sent by the client, cache reply data, and forward the reply data sent by the server to the client; judging the data request message sent by the client includes: when the URL identifier corresponding to the current data request message sent by the client is found in the cache database, the data caching device sends the cached reply data corresponding to the current URL identifier to the client; otherwise, the data caching device forwards the data request message sent by the client to the server;
the server is configured to send, according to the data request message of the client forwarded by the data caching device, the reply data corresponding to the URL identifier of the client's data request message to the data caching device for processing.
As an improvement, in the data caching system for network requests, the data caching device further includes a cloud pushing means; the cloud pushing means is configured to send the latest updated reply data in the data caching device to the client, thereby realizing real-time updating of the data in the data caching device.
Compared with the prior art, the advantages of the present invention are as follows:
First, a data caching device is added between the client and the server; the data caching device receives and caches the data request messages transmitted on the network and establishes a cache database for the cached request messages, the cache database including each data request message, the URL identifier corresponding to each data request message, and the reply data corresponding to each URL identifier. When the data caching device determines that the URL identifier of a data request message sent by the client is present in the cache database, it sends the cached reply data corresponding to the current URL identifier to the client, avoiding the delay to the client caused by traditional caching methods that require the server to feed back the reply data; otherwise, the data caching device requests the server to send the reply data corresponding to that URL identifier to the data caching device, which then forwards it to the client, so that the client receives its requested data.
Second, when the server updates reply data, the reply data in the cache database of the data caching device is updated accordingly, ensuring that the client can obtain the latest data in real time.
Third, different cache priority levels are assigned to the audio data, video data, image data and text data respectively; when the data in the cache database needs to be updated, the audio data, video data, image data and text data are updated and cached in order according to the assigned cache priority levels, so as to make full use of the network, improve network utilization, and avoid the adverse effect of fluctuating network speed on the client's data retrieval.
Fourth, the data caching device counts the frequency with which each piece of data in its cache database is accessed by clients to obtain the access frequency of each piece of data; according to the order of access frequencies, it assigns different update priorities to the pieces of data accordingly and updates the frequently accessed data first, improving the efficiency with which the client retrieves data from the server.
Finally, after the reply data in the cache database is updated, the data caching device deletes the expired reply data, saving cache space in the data caching device.
Brief Description of the Drawings
Fig. 1 is a flow diagram of the data caching method for network requests in an embodiment of the present invention;
Fig. 2 is a schematic diagram of the data caching system for network requests in an embodiment of the present invention.
Detailed Description of the Embodiments
The present invention will be described in further detail below with reference to the embodiments and the accompanying drawings.
As shown in Fig. 1, the data caching method for network requests in this embodiment, for realizing data communication between a client and a server, includes the following steps 1 to 5:
Step 1: a data caching device is added between the client and the server, so that each data request message transmitted on the network is received by the data caching device; each data request message carries a URL identifier characterizing that data request message, where a URL (Uniform Resource Locator) identifier refers to a uniform resource locator;
Step 2: the data caching device intercepts the URL identifier of each data request message, and caches each URL identifier together with the reply data corresponding to each URL identifier into the data caching device, thereby establishing a cache database for the cached request messages; the cached data in the cache database includes each data request message, the URL identifier corresponding to each data request message, and the reply data corresponding to each URL identifier;
In this embodiment, specifically, the cached data in the cache database of the data caching device includes at least one of audio data, video data, image data and text data;
the cache database has cache subspaces for the above different types of cached data; for example, the cache subspaces include an audio data cache subspace, a video data cache subspace, an image data cache subspace and a text data cache subspace; audio data is cached in the audio data cache subspace, video data in the video data cache subspace, image data in the image data cache subspace, and text data in the text data cache subspace;
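The typed cache subspaces above can be sketched as one sub-cache per media type; the class name, type labels and routing rule here are illustrative assumptions, not specified by the embodiment.

```python
# Minimal sketch of cache subspaces: one dictionary per media type.

MEDIA_TYPES = ("audio", "video", "image", "text")

class PartitionedCache:
    def __init__(self):
        # One cache subspace per media type.
        self._subspaces = {t: {} for t in MEDIA_TYPES}

    def put(self, media_type, url_id, reply):
        # Store the reply in the subspace for its media type.
        self._subspaces[media_type][url_id] = reply

    def get(self, media_type, url_id):
        return self._subspaces[media_type].get(url_id)

cache = PartitionedCache()
cache.put("video", "/v/1", b"...mp4 bytes...")
cache.put("text", "/t/1", "hello")
```

Partitioning by type keeps each subspace independently manageable, which the later priority and eviction improvements rely on.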
Step 3: when the client needs to send a data request message to the server, the client sends the data request message to the data caching device, and the data caching device makes the following judgment:
if the URL identifier corresponding to the current data request message sent by the client is found in the cache database, the data caching device sends the cached data corresponding to the current URL identifier to the client; otherwise, the data caching device forwards the data request message sent by the client to the server; the data request message, the URL identifier and the data targeted by the data request message are in one-to-one correspondence, i.e., each piece of data corresponds to exactly one URL identifier;
Step 4: according to the data request message of the client forwarded by the data caching device, the server sends the reply data corresponding to the URL identifier of that data request message to the data caching device for caching;
Step 5: the data caching device sends the cached reply data transmitted by the server to the client, so that the client receives its requested data.
Of course, specific to this embodiment, when the server updates reply data, the server first sends the updated reply data together with the URL identifier corresponding to the updated reply data to the data caching device; the data caching device then updates the reply data and the corresponding URL identifier in its cache database; finally, when the client requests the data again, the data caching device sends the updated, latest reply data corresponding to the URL identifier in the cache database to the client, thereby realizing real-time updating of the data in the data caching device.
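The server-driven update path above can be sketched as follows: the server pushes a (URL identifier, new reply) pair and the device overwrites its cache entry, so later requests see the latest reply. The method names are illustrative assumptions.

```python
# Minimal sketch of real-time updating: the server pushes updated reply
# data, which replaces the stale entry in the cache database.

class UpdatableCache:
    def __init__(self):
        self._cache = {}   # URL id -> latest reply data

    def receive_update(self, url_id, new_reply):
        # Overwrite the stale entry with the server's latest reply data.
        self._cache[url_id] = new_reply

    def serve(self, url_id):
        return self._cache.get(url_id)

cache = UpdatableCache()
cache.receive_update("/api/news", "v1")
cache.receive_update("/api/news", "v2")   # server pushes an update
latest = cache.serve("/api/news")         # a re-requesting client sees "v2"
```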
For the data caching device, after the reply data in the cache database is updated, the data caching device deletes the expired reply data, so as to save cache space in the data caching device.
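One way to realize the deletion of expired reply data is a timestamp-plus-TTL scheme, sketched below; the TTL value, field layout and method names are assumptions for illustration, since the embodiment does not specify how expiry is determined.

```python
# Minimal sketch of expired-entry deletion with a fixed time-to-live.

import time

class ExpiringCache:
    def __init__(self, ttl_seconds):
        self._ttl = ttl_seconds
        self._cache = {}   # URL id -> (stored_at, reply data)

    def put(self, url_id, reply, now=None):
        self._cache[url_id] = (now if now is not None else time.time(), reply)

    def purge_expired(self, now=None):
        # Delete entries older than the TTL to save cache space.
        now = now if now is not None else time.time()
        expired = [k for k, (t, _) in self._cache.items() if now - t > self._ttl]
        for k in expired:
            del self._cache[k]
        return expired

cache = ExpiringCache(ttl_seconds=60)
cache.put("/old", "stale", now=0)
cache.put("/new", "fresh", now=100)
removed = cache.purge_expired(now=100)   # only "/old" exceeds the TTL
```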
Furthermore, considering the influence of network speed on data interaction, and in order to make full use of the network and increase network utilization, the data caching method for network requests in this embodiment further includes: assigning different cache priority levels to the above audio data, video data, image data and text data respectively; when the data in the cache database needs to be updated, the audio data, video data, image data and text data are updated and cached in order according to the assigned cache priority levels.
For example, video data is given first-level cache priority, audio data second-level cache priority, image data third-level cache priority and text data fourth-level cache priority; then, when the data in the cache database needs to be updated, according to the cache priority levels assigned to these four types of data, the video data is cached first, then the audio data, next the image data, and finally the text data. In this way, as long as the network is good during the initial period, the bulky video data is cached first; even if a period of poor network conditions is encountered later, the small text data can still be cached gradually, so that the overall caching time of the above data is not delayed.
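The priority ordering in the example above amounts to sorting the pending media types by their assigned level; the numeric levels mirror the example, and the function name is an illustrative assumption.

```python
# Minimal sketch of cache priority levels: 1 = highest priority.

CACHE_PRIORITY = {"video": 1, "audio": 2, "image": 3, "text": 4}

def order_for_update(pending_types):
    # Sort the types awaiting update by their assigned cache priority level.
    return sorted(pending_types, key=CACHE_PRIORITY.__getitem__)

order = order_for_update(["text", "image", "video", "audio"])
# Bulky video data first while the network is good, small text data last.
```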
Of course, the data caching method for network requests may further include: the data caching device counts the frequency with which each piece of data in its cache database is accessed by clients, so as to obtain the access frequency of each piece of data; according to the order of access frequencies, the data caching device assigns different update priorities to the pieces of data accordingly and updates the frequently accessed data first. For example, if statistics show that the text data has the highest access frequency, the image data the second highest, the audio data the third, and the video data the lowest, then the text data is updated first, followed by the image data, then the audio data, and finally the video data.
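The access-frequency bookkeeping described above can be sketched as a hit counter per URL identifier, with the update order being the descending order of hit counts; the class and method names are assumptions introduced here.

```python
# Minimal sketch of frequency-based update priority: every cache access
# increments a counter, and updates proceed from most to least accessed.

from collections import Counter

class FrequencyTrackedCache:
    def __init__(self):
        self._cache = {}
        self._hits = Counter()   # URL id -> times accessed by clients

    def put(self, url_id, reply):
        self._cache[url_id] = reply

    def get(self, url_id):
        self._hits[url_id] += 1
        return self._cache.get(url_id)

    def update_order(self):
        # Most frequently accessed data is updated first.
        return [url_id for url_id, _ in self._hits.most_common()]

cache = FrequencyTrackedCache()
cache.put("/text", "t")
cache.put("/video", "v")
for _ in range(3):
    cache.get("/text")
cache.get("/video")
order = cache.update_order()   # "/text" (3 hits) before "/video" (1 hit)
```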
In addition, as shown in Fig. 2, this embodiment also provides a data caching system for network requests, including a client, a server and a data caching device, the data caching device being communicatively connected to the client and the server respectively; wherein:
the client is configured to send a data request message to the data caching device, the data request message carrying a URL identifier characterizing that data request message;
the data caching device is configured to receive the data request message sent by the client, judge the data request message sent by the client, cache reply data, and forward the reply data sent by the server to the client; judging the data request message sent by the client includes: when the URL identifier corresponding to the current data request message sent by the client is found in the cache database, the data caching device sends the cached reply data corresponding to the current URL identifier to the client; otherwise, the data caching device forwards the data request message sent by the client to the server;
the server is configured to send, according to the data request message of the client forwarded by the data caching device, the reply data corresponding to the URL identifier of the client's data request message to the data caching device for processing.
In addition, the data caching device may further include a cloud pushing means; the cloud pushing means is configured to send the latest updated reply data in the data caching device to the client, thereby realizing real-time updating of the data in the data caching device.
Although preferred embodiments of the present invention have been described in detail above, it should be clearly understood that the present invention may be modified and varied in many ways by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.