Summary of the invention
In order to solve the above-mentioned problems in the prior art, the present invention provides a data processing method and system, a device and a storage medium based on multi-level caching, which effectively reduce the pressure placed on the data provider's server and on the database when application data are used.
According to an embodiment of the present invention, a data processing method based on multi-level caching is provided. The method includes: receiving a data query request; querying the multi-level cache level by level according to the query request; and returning the queried target data while backfilling the target data into the caches preceding the cache in which it was found.
In certain embodiments of the present invention, the multi-level cache includes at least a first cache, a second cache and a third cache. Querying the multi-level cache level by level according to the query request includes: querying the first cache according to the query request; exiting the query if the target data is found in the first cache, otherwise querying the second cache; exiting the query if the target data is found in the second cache, otherwise querying the third cache; exiting the query if the target data is found in the third cache, otherwise returning information indicating that the query failed.
In certain embodiments of the present invention, backfilling the target data into the caches preceding the cache in which it was found includes: exiting the query if the target data is found in the first cache; exiting the query if the target data is found in the second cache, and backfilling the target data into the first cache; exiting the query if the target data is found in the third cache, and backfilling the target data into both the first cache and the second cache.
In certain embodiments of the present invention, the multi-level cache includes a local cache, a distributed cache and a database.
In certain embodiments of the present invention, the method further includes: receiving data change information and sending it to the multi-level cache; the multi-level cache updates its contents according to the data change information.
According to an embodiment of the present invention, a data processing system based on multi-level caching is provided. The system comprises: a communication module for receiving a data query request; a query module for querying the multi-level cache level by level according to the query request; and a backfill module for returning the queried target data and backfilling the target data into the caches preceding the cache in which it was found.
In certain embodiments of the present invention, the multi-level cache includes at least a first cache, a second cache and a third cache. The query module is configured to: query the first cache according to the query request; exit the query if the target data is found in the first cache, otherwise query the second cache; exit the query if the target data is found in the second cache, otherwise query the third cache; exit the query if the target data is found in the third cache, otherwise return information indicating that the query failed.
In certain embodiments of the present invention, the backfill module is configured to: exit the query if the target data is found in the first cache; exit the query if the target data is found in the second cache, and backfill the target data into the first cache; exit the query if the target data is found in the third cache, and backfill the target data into both the first cache and the second cache.
In certain embodiments of the present invention, the multi-level cache includes a local cache, a distributed cache and a database.
In certain embodiments of the present invention, the system further includes an information change module for receiving data change information and sending it to the multi-level cache; the multi-level cache updates its contents according to the data change information.
Meanwhile the present invention provides a kind of data processing equipment based on multi-level buffer, including memory and processor, institutesState memory for store one or more computer instruction;The processor is for calling one or more computer to refer toIt enables thereby executing any one of aforementioned data processing method.
The present invention also provides a kind of computer storage medium, it is stored with one or more computer program, described oneOr a plurality of computer program realizes any one of aforementioned data processing method when calling.
Data are carried out multistage storage by way of backfill by embodiments of the present invention, so that data under distributed environmentUser and provider can separate and respectively dispose, meanwhile, effectively reduce using when data to data providing serverWith pressure caused by database.
Specific embodiment
Various aspects of the present invention are described in detail below in conjunction with the drawings and specific embodiments, in which well-known modules, units and their interconnections, links, communications or operations are not shown or described in detail. The described features, architectures or functions may be combined in any manner in one or more embodiments. Those skilled in the art should understand that the following embodiments are merely illustrative and are not intended to limit the scope of protection of the present invention. It should also be readily understood that the modules, units or processing approaches in the embodiments described herein and shown in the drawings may be combined and designed in various different configurations.
First, terms used in the present invention are explained:
Application component: a set of concrete implementations for the same category of service products or services whose business functions, business processes and implementation patterns are similar;
Application data: the data used by an application component; the data are divided into different types according to their meaning, and each type of data may contain multiple key-value pairs;
Parameter provider: if an application component is responsible for maintaining a certain type of application data, that component is called the provider of that data;
Parameter user: if an application component uses data provided by a data provider and does not maintain that data itself, that component is called the user of that data.
Fig. 1 is a flow diagram of a data processing method according to an embodiment of the present invention. Referring to Fig. 1, in an embodiment of the present invention the method specifically includes:
100: receiving a data query request;
101: querying the multi-level cache level by level according to the query request;
102: returning the queried target data, and backfilling the target data into the caches preceding the cache in which it was found.
In an embodiment of the present invention, the multi-level cache includes at least a first cache, a second cache and a third cache. The above processing 101 can be accomplished in the following way: query the first cache according to the query request; exit the query if the target data is found in the first cache, otherwise query the second cache; exit the query if the target data is found in the second cache, otherwise query the third cache; exit the query if the target data is found in the third cache, otherwise return information indicating that the query failed.
Likewise, in an embodiment of the present invention, the above processing 102 can be accomplished in the following way: exit the query if the target data is found in the first cache; exit the query if the target data is found in the second cache, and backfill the target data into the first cache; exit the query if the target data is found in the third cache, and backfill the target data into both the first cache and the second cache. As a result, the next time the same data is queried, it will be found in an earlier cache, which saves query time and improves efficiency.
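As a non-limiting sketch of processing 101 and 102, the following Java code queries the cache levels in order and, on a hit, backfills every preceding level. The Cache interface and its get/put methods are assumptions introduced only for illustration and do not correspond to any interface named in this disclosure.

import java.util.List;

public class MultiLevelCacheLookup {
    // Minimal cache abstraction assumed for this sketch.
    public interface Cache {
        String get(String key);
        void put(String key, String value);
    }

    private final List<Cache> levels; // first, second, third cache in query order

    public MultiLevelCacheLookup(List<Cache> levels) {
        this.levels = levels;
    }

    // Processing 101/102: query level by level; on a hit, backfill all earlier levels.
    public String query(String key) {
        for (int i = 0; i < levels.size(); i++) {
            String value = levels.get(i).get(key);
            if (value != null) {
                for (int j = 0; j < i; j++) {
                    levels.get(j).put(key, value); // backfill the caches before the hit
                }
                return value;
            }
        }
        return null; // corresponds to returning information that the query failed
    }
}

With three levels configured as the first, second and third cache, a hit on the third level backfills the first and second levels, matching the behavior described above.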
Through the above method, embodiments of the present invention store data at multiple levels by way of backfilling, so that the users and providers of data in a distributed environment can be separated and deployed independently, while the pressure placed on the data provider's server and on the database when the data are used is effectively reduced.
Hereinafter, the realization process of an exemplary data processing method of the present invention will be explained in detail.
In this implementation, the multi-level cache includes a local cache, a distributed cache and a database, arranged in query priority order: local cache first, then distributed cache, then database.
After a query request from a data user is received, a data query is performed in the multi-level cache by calling a data acquisition API (Application Programming Interface).
Specifically, the calling method is as follows:
Table 1
In this implementation, the data acquisition API enters the query core logic, which calls the query paths in turn. Specifically, the query paths are as follows:
1. Local cache (in memory): the configured path is local, and the path configuration data is the number of key-value pairs held in the local cache for this type of data;
2. Distributed cache: the configured path is cache, and the path configuration data is the instance name of the distributed cache accessed for this type of data;
3. Remote call: the configured path is remote, and the path configuration data is the service code published by the data provider;
4. Persistence interface: the configured path is db, and the path configuration data is the name of the persistence query class for the data.
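Purely as an illustration, the four path types can be pictured as the following enumeration; the enum and its comments merely restate the configuration keywords above and are not part of the original disclosure.

public enum QueryPathType {
    LOCAL,  // local in-memory cache; path data: number of key-value pairs held locally
    CACHE,  // distributed cache; path data: instance name of the distributed cache
    REMOTE, // remote call; path data: service code published by the data provider
    DB      // persistence interface; path data: name of the persistence query class
}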
The structured data retrieved from the database through the persistence interface is converted into a JSON string by a JSON conversion tool, so that it can be stored in the local cache or the distributed cache, or returned through the remote call. This unified exchange format facilitates the transfer of data between the various parts.
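As a sketch only, a JSON conversion tool such as Jackson could perform this conversion; the CustomerRatingStrategy record and its fields below are assumptions introduced for illustration, not data structures defined by this embodiment.

import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonConversionExample {
    // Example structured record as it might be read from the database (fields are assumed).
    public static class CustomerRatingStrategy {
        public String customerId;
        public int rating;
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        CustomerRatingStrategy row = new CustomerRatingStrategy();
        row.customerId = "C001";
        row.rating = 3;

        // Structured data -> JSON string, suitable for the local cache, the distributed cache
        // or the remote-call response.
        String json = mapper.writeValueAsString(row);

        // JSON string -> structured data when read back from a cache.
        CustomerRatingStrategy restored = mapper.readValue(json, CustomerRatingStrategy.class);
        System.out.println(json + " / rating=" + restored.rating);
    }
}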
Meanwhile the configuration that core logic depends on current application component is inquired, above 4 kinds of query path can according to needConfigure different query strategies.Query strategy example are as follows:
# data type = data provider flag, data validity period (seconds), access paths
Specifically, for example:
TransactionControlStrategy=false,600,local(100),cache(strategyParam),remote(SRV001)
TransactionAccumulationControlStrategy=false,600,local,cache(strategyParam),remote(SRV001)
CustomerBacklist=false,7200,cache(customerparam)
CustomerRatingStrategy=true,3600,cache(strategyParam),db(com.demo.CustomerRatingStrategyDao)
In the above example, the first data type, TransactionControlStrategy, represents policy data used for transaction control. The current component is a user of this data and the validity period is 600 seconds. The access path is: first access the local cache, then the distributed cache whose instance name is strategyParam, and finally, if none of the earlier queries finds the data (or only expired content is found), obtain it from the remote service whose service code is SRV001.
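As a sketch only, such a strategy line could be split into its fields as shown below; the parsing code and variable names are assumptions for illustration and are not the configuration parser of this embodiment.

import java.util.Arrays;
import java.util.List;

public class StrategyLineExample {
    public static void main(String[] args) {
        // data type = data provider flag, validity period (seconds), access paths
        String line = "TransactionControlStrategy=false,600,local(100),cache(strategyParam),remote(SRV001)";

        String[] keyValue = line.split("=", 2);
        String dataType = keyValue[0];

        String[] fields = keyValue[1].split(",");
        boolean isProvider = Boolean.parseBoolean(fields[0]); // false: this component is a data user
        int validitySeconds = Integer.parseInt(fields[1]);    // 600 seconds
        List<String> paths = Arrays.asList(fields).subList(2, fields.length); // ordered access paths

        System.out.println(dataType + " provider=" + isProvider
                + " ttl=" + validitySeconds + "s paths=" + paths);
    }
}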
In this implementation, when a piece of data is obtained through the 2nd path, it is backfilled into the 1st path; when a piece of data is obtained through the 3rd path, it is backfilled into both the 2nd and the 1st paths, and so on. In this way, the next time the same data is queried, it will be found on an earlier path, which saves query time and improves efficiency. In particular, if the current application component is not the data provider, only the local cache is updated.
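A minimal sketch of this backfill rule, assuming a CachePath abstraction with a put method and an isLocalCache flag; whether the current component is the data provider is passed in as a parameter. All names here are illustrative assumptions.

import java.util.List;

public class BackfillRule {
    public interface CachePath {
        void put(String key, String value);
        boolean isLocalCache();
    }

    // Backfill into all paths before the one that answered (hitIndex);
    // a component that is not the data provider only updates its local cache.
    public static void backfill(List<CachePath> paths, int hitIndex,
                                String key, String value, boolean isDataProvider) {
        for (int i = 0; i < hitIndex; i++) {
            CachePath path = paths.get(i);
            if (isDataProvider || path.isLocalCache()) {
                path.put(key, value);
            }
        }
    }
}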
Meanwhile in embodiments of the present invention, if there is data to be changed due to maintenance or other,Global unified change can be carried out in the following manner:
Data modification information is received, and is sent to the multi-level buffer;
The multi-level buffer carries out information change according to the data modification information.
Specifically, the provider of the data sends out data modification information when some data changes, by shouldData modification information is sent to other cachings, other cachings is made voluntarily to carry out data change.It preferably, can be for example, by broadcastMode transmit data modification information.Meanwhile may include version number information in the data modification information, each caching passes through comparisonVersion number information judges whether to need to carry out data change.
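A minimal sketch of such version-checked change propagation, assuming a simple ChangeMessage type; the broadcast transport is omitted and all names are assumptions introduced for illustration.

public class ChangeMessage {
    // Data change information carrying the data type, the new value and a version number.
    public final String dataType;
    public final String newValueJson;
    public final long version;

    public ChangeMessage(String dataType, String newValueJson, long version) {
        this.dataType = dataType;
        this.newValueJson = newValueJson;
        this.version = version;
    }
}

class CacheNode {
    private long currentVersion = 0;
    private String cachedJson;

    // Each cache compares version numbers to decide whether the change must be applied.
    public void onChange(ChangeMessage msg) {
        if (msg.version > currentVersion) {
            currentVersion = msg.version;
            cachedJson = msg.newValueJson;
        }
    }

    public String currentValue() {
        return cachedJson;
    }
}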
Fig. 2 is a block diagram of a data processing system 1 according to an embodiment of the present invention. Referring to Fig. 2, the data processing system 1 includes: a communication module 11 for receiving a data query request; a query module 12 for querying the multi-level cache level by level according to the query request; and a backfill module 13 for returning the queried target data and backfilling the target data into the caches preceding the cache in which it was found.
In an embodiment of the present invention, the multi-level cache includes at least a first cache, a second cache and a third cache. The query module 12 is configured to: query the first cache according to the query request; exit the query if the target data is found in the first cache, otherwise query the second cache; exit the query if the target data is found in the second cache, otherwise query the third cache; exit the query if the target data is found in the third cache, otherwise return information indicating that the query failed.
In an embodiment of the present invention, the backfill module 13 is configured to: exit the query if the target data is found in the first cache; exit the query if the target data is found in the second cache, and backfill the target data into the first cache; exit the query if the target data is found in the third cache, and backfill the target data into both the first cache and the second cache.
In an embodiment of the present invention, the data processing system 1 further includes an information change module 14 for receiving data change information and sending it to the multi-level cache; the multi-level cache updates its contents according to the data change information.
Optionally, an embodiment of the present invention provides a data processing device based on multi-level caching. The data processing device includes a memory for storing one or more computer instructions, and a processor for calling the one or more computer instructions to execute the data processing method provided by any of the aforementioned embodiments or implementations of the present invention. Optionally, in one implementation, the data processing device may further include an input/output interface for data communication. For example, the processing device may be an intelligent terminal, a server, or the like.
An embodiment of the present invention further provides a computer storage medium storing one or more computer instructions which, when called, implement the data processing method provided by the aforementioned embodiments or implementations of the present invention. For example, the storage medium may include a hard disk, a floppy disk, an optical disc, and the like.
Although some embodiments are illustrated herein, various modifications may be made to these embodiments without departing from the spirit of the present invention, and all such variations still fall within the concept of the present invention and within the scope of protection defined by the claims of the present invention.
The specific embodiments disclosed herein are merely illustrative of the present invention. It will be apparent to those skilled in the art that various modifications may be made in light of the teachings herein and that the present invention may be implemented in various equivalent manners; therefore, the specific embodiments of the present invention disclosed above are exemplary only, and the scope of protection is not limited by the details of construction or design disclosed herein unless otherwise stated in the claims. Accordingly, the specific exemplary embodiments disclosed above may be variously substituted, combined or modified, and all such variations fall within the scope disclosed herein. The data processing method and system, device and storage medium based on multi-level caching exemplified here may still be suitably practiced in the absence of any element not specifically disclosed herein, or in the absence of an optional component disclosed herein. All values and ranges disclosed above may also vary to some extent. Whenever a numerical range with a lower limit and an upper limit is disclosed, any value and any included range falling within that range is specifically disclosed. In particular, every range of values disclosed herein should be understood as reciting any value and any range included within the broader range. Likewise, unless the applicant explicitly and clearly defines otherwise, the terms in the claims have their plain, ordinary meaning.
Through the above description of the embodiments, those skilled in the art can clearly understand that the present invention can be implemented by means of software combined with a hardware platform. Based on this understanding, the part of the technical solution of the present invention that contributes over the background art can be embodied, in whole or in part, in the form of a software product. The software product can be stored in a storage medium, such as a ROM/RAM, a magnetic disk or an optical disc, and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods of the embodiments or implementations of the present invention, or certain parts thereof.
In addition, the number of a component in the claims includes one or at least one, unless otherwise indicated. If the usage or meaning of a word or term in the present invention is inconsistent with its usage or meaning in other documents, the definition in the present invention shall prevail.