Specific Embodiments
Embodiments of the disclosure are described in detail below, and examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals throughout denote the same or similar elements or elements having the same or similar functions. The embodiments described below with reference to the accompanying drawings are exemplary, are intended to explain the disclosure, and should not be construed as limiting the disclosure.
A user interface generation method and a user interface generation apparatus according to embodiments of the present disclosure are described below with reference to the accompanying drawings.
Fig. 1 is a flow diagram of a user interface generation method provided by an embodiment of the present disclosure. The present embodiment provides a user interface generation method whose execution subject is a user interface generation apparatus, and the execution subject is composed of hardware and/or software.
As shown in Fig. 1, the user interface generation method includes the following steps:
S101: obtain image feature data of a hand-drawn interface corresponding to a user interface to be generated, and obtain feature data of a predetermined initial token.
In the present embodiment, the user interface can be understood as the interface presented to a user when an application program runs, and the hand-drawn interface can be understood as a hand-drawn sketch of that user interface. The hand-drawn interface may be a photograph of a sketch drawn by a developer on paper, or may be an electronic sketch drawn by the developer with drawing software, but is not limited thereto.
In one possible implementation, "obtaining the image feature data of the hand-drawn interface corresponding to the user interface to be generated" is implemented as follows: receive the hand-drawn interface corresponding to the user interface to be generated, convert the hand-drawn interface into a matrix, and obtain an image matrix of the hand-drawn interface; input the image matrix into a convolutional neural network model to obtain the image feature data of the hand-drawn interface, where the convolutional neural network model is obtained by training a convolutional neural network with the training hand-drawn interfaces.
In the present embodiment, after the hand-drawn interface is received, it is converted into a matrix to obtain the corresponding image matrix. The image matrix can be understood as the digital image data of the hand-drawn interface: the rows of the matrix correspond to the height of the image (in pixels), the columns of the matrix correspond to the width of the image (in pixels), the elements of the matrix correspond to the pixels of the image, and the value of a matrix element is the gray value of the corresponding pixel.
So that the image matrix represents the hand-drawn interface more accurately, image preprocessing may be performed on the hand-drawn interface before the matrix conversion. The image preprocessing includes, but is not limited to, noise elimination, image binarization, and the like.
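Purely as an illustration (not the disclosed implementation), the following Python sketch shows one way the preprocessing and matrix conversion described above could be performed; the use of Pillow and NumPy, the fixed 256x256 size, the median filter, and the threshold value are assumptions made for this sketch only.

```python
# Minimal sketch of converting a hand-drawn interface into an image matrix,
# with simple preprocessing (noise elimination and binarization).
# The file name, image size, and threshold are illustrative assumptions.
import numpy as np
from PIL import Image, ImageFilter

def sketch_to_matrix(path: str, size=(256, 256), threshold=128) -> np.ndarray:
    img = Image.open(path).convert("L")            # grayscale: one value per pixel
    img = img.filter(ImageFilter.MedianFilter(3))  # simple noise elimination
    img = img.resize(size)                         # rows = height, columns = width (pixels)
    matrix = np.asarray(img, dtype=np.uint8)       # matrix elements = pixel gray values
    return (matrix < threshold).astype(np.float32) # binarization: dark strokes -> 1, paper -> 0

# matrix = sketch_to_matrix("hand_drawn_interface.png")  # shape (256, 256)
```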
In the present embodiment, after the image matrix characterizing the hand-drawn interface is obtained, the image matrix is input into the pre-trained convolutional neural network model, and the image feature data of the hand-drawn interface is extracted.
Because convolutional neural networks (Convolutional Neural Network, CNN) excel at image processing, the present embodiment extracts the image feature data of the hand-drawn interface with a convolutional neural network model, which improves both the accuracy and the efficiency of extracting the image feature data of the hand-drawn interface.
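For illustration, one possible form of such a convolutional feature extractor is sketched below in PyTorch; the layer sizes, the 256-dimensional feature vector, and the single-channel 256x256 input are choices made for this sketch, not the specific model trained in the disclosure.

```python
# Sketch of a convolutional encoder that maps the image matrix of a hand-drawn
# interface to image feature data. Dimensions are illustrative assumptions.
import torch
import torch.nn as nn

class SketchEncoder(nn.Module):
    def __init__(self, feature_dim: int = 256):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.fc = nn.Linear(128 * 32 * 32, feature_dim)

    def forward(self, image_matrix: torch.Tensor) -> torch.Tensor:
        # image_matrix: (batch, 1, 256, 256) binarized sketch
        x = self.conv(image_matrix)
        return self.fc(x.flatten(start_dim=1))  # image feature data, shape (batch, feature_dim)

# features = SketchEncoder()(torch.zeros(1, 1, 256, 256))  # -> torch.Size([1, 256])
```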
The convolutional neural network model in the present embodiment is obtained by training a convolutional neural network with the training hand-drawn interfaces; for how to train a convolutional neural network, reference may be made to the related art, and details are not described herein. A training hand-drawn interface can be understood as a training sample of the convolutional neural network model; it may be a photograph of a sketch drawn by a developer on paper, or an electronic sketch drawn by the developer with drawing software, but is not limited thereto. It can be understood that the more training hand-drawn interfaces there are, the higher the precision of the convolutional neural network model; the total number of training hand-drawn interfaces is set by the developer according to the practical situation.
The token is briefly introduced here first: a token is a term from compiler theory and can be understood as the smallest unit that constitutes source code. In the compilation phase, the process in which a lexical analyzer reads the character stream constituting the source code and outputs tokens is called tokenization; for more on tokens, reference may be made to the related art, and details are not described herein.
Taking GUI code for a text label as an example, the entire GUI code can be parsed into the following tokens: <PAD>, <START>, <text>, x, y, width, height, content, </text>, <END>, where <PAD> indicates a blank and only serves as a placeholder, <START> indicates the start of the GUI code, <text> indicates a text label, x is the abscissa of the label in the coordinate system of the application program's user interface, y is the ordinate of the label in that coordinate system, width and height are the width and height of the label, content indicates the text content, </text> indicates the end of the label, and <END> indicates the end of the GUI code.
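Purely as an illustration of how GUI code decomposes into such tokens, the sketch below uses a whitespace-based lexer; the concrete GUI-code string and its coordinate and size values are hypothetical, and a real lexical analyzer would follow the lexical rules of the actual GUI code.

```python
# Hypothetical, simplified lexical analysis: the GUI code is assumed here to be
# written with one token per whitespace-separated field.
def tokenize(gui_code: str) -> list[str]:
    return gui_code.split()

tokens = tokenize("<START> <text> 10 20 120 40 Hello </text> <END>")
# -> ['<START>', '<text>', '10', '20', '120', '40', 'Hello', '</text>', '<END>']
```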
In the present embodiment, the first recurrent neural network model is used to output the token sequence of the hand-drawn interface. To start the first recurrent neural network model outputting the token sequence of the hand-drawn interface, an initial token needs to be determined in advance. The initial token may be a system-default initial token or an initial token set by the developer, but is not limited thereto; for example, the initial token is START.
In the present embodiment, after the predetermined initial token is obtained, the process of extracting the feature data of the initial token is entered.
In one possible implementation, "obtaining the feature data of the predetermined initial token" is implemented as follows: input the predetermined initial token into a second recurrent neural network model to obtain the feature data of the initial token.
The present embodiment extracts the feature data of a token with the pre-trained second recurrent neural network model, which improves the precision of the feature data of the token.
In one possible implementation, "training the second recurrent neural network" is implemented as follows: obtain the GUI code corresponding to each training hand-drawn interface, perform lexical analysis on the corresponding GUI code, and obtain the training tokens of each training hand-drawn interface; train the second recurrent neural network with the training tokens to obtain the second recurrent neural network model.
Specifically, the GUI code of each training hand-drawn interface is prepared in advance, and the training tokens of each training hand-drawn interface are obtained by performing lexical analysis on the GUI code. The training tokens are the training samples of the second recurrent neural network; each training sample is input into the second recurrent neural network, the parameters of the second recurrent neural network are adjusted, and the second recurrent neural network model is obtained.
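One way such a second recurrent neural network could be realized is sketched below in PyTorch as a token-embedding layer followed by a GRU; the vocabulary, the 128-dimensional feature data, and the choice of a GRU cell are assumptions for illustration rather than the specific network of the disclosure.

```python
# Sketch of a token encoder: maps tokens (by vocabulary index) to feature data.
import torch
import torch.nn as nn

class TokenEncoder(nn.Module):
    def __init__(self, vocab_size: int, feature_dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, feature_dim)
        self.rnn = nn.GRU(feature_dim, feature_dim, batch_first=True)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) indices of tokens such as <START>, <text>, ...
        embedded = self.embed(token_ids)
        _, hidden = self.rnn(embedded)
        return hidden[-1]            # feature data of the last token, (batch, feature_dim)

# vocab = {"<PAD>": 0, "<START>": 1, "<text>": 2, "</text>": 3, "<END>": 4}
# start_features = TokenEncoder(len(vocab))(torch.tensor([[1]]))  # feature data of <START>
```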
S102: determine the token sequence of the hand-drawn interface according to the image feature data, the feature data of the initial token, and a pre-trained first recurrent neural network model.
Because a recurrent neural network (Recurrent Neural Network, RNN) can reach very high precision when handling time-series data and is the preferred network for handling time-series data, the present embodiment outputs the token sequence of the hand-drawn interface with the pre-trained first recurrent neural network model. The token sequence includes N tokens sorted in order of output, where N is an integer greater than 1.
In the present embodiment, after the image feature data of the hand-drawn interface and the feature data of the initial token are extracted, the first recurrent neural network model can start outputting tokens one by one, and the tokens sorted in order of output form the token sequence.
Taking the initial token START as an example, the image feature data and the feature data of START are input into the first recurrent neural network model, and the first recurrent neural network model outputs the 1st token; then the image feature data and the feature data of the 1st token are input into the first recurrent neural network model, and the first recurrent neural network model outputs the 2nd token; and so on, the image feature data and the feature data of the previously output token are input into the first recurrent neural network model to obtain the next output token, until the N-th token is obtained. The 1st token, the 2nd token, ..., and the N-th token, sorted in order of output, form the token sequence.
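The token-by-token loop described in this paragraph can be sketched as follows; sketch_encoder, token_encoder, and first_rnn stand for already-trained models with the call signatures assumed below, and the loop terminates either at the end token or at an assumed maximum length. This is an illustrative sketch, not the exact inference code of the disclosure.

```python
# Sketch of the decoding loop: image feature data plus the feature data of the
# previous token are fed to the first recurrent neural network model, which
# outputs the next token, until the end token is produced.
import torch

@torch.no_grad()
def generate_token_sequence(image_matrix, sketch_encoder, token_encoder,
                            first_rnn, vocab, max_len: int = 100) -> list[str]:
    inv_vocab = {i: t for t, i in vocab.items()}
    image_features = sketch_encoder(image_matrix)        # image feature data of the sketch
    prev_token = "<START>"                               # the predetermined initial token
    tokens = []
    for _ in range(max_len):
        prev_id = torch.tensor([[vocab[prev_token]]])
        token_features = token_encoder(prev_id)          # feature data of the previous token
        logits = first_rnn(image_features, token_features)  # scores over the token vocabulary
        next_token = inv_vocab[int(logits.argmax(dim=-1))]
        tokens.append(next_token)
        if next_token == "<END>":                        # the end token closes the sequence
            break
        prev_token = next_token
    return tokens
```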
S103: generate the user interface according to the token sequence.
Specifically, the token sequence can be understood as tokenized GUI code that conforms to the lexical rules, and the corresponding GUI code can be obtained by parsing the token sequence. After the GUI code is obtained, it is compiled in a mobile application development framework to generate the corresponding user interface.
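As a very rough illustration of this parsing step, the sketch below joins the non-control tokens of the token sequence back into a GUI-code string and drops it into a placeholder page template; a real parser would follow the lexical rules of the target development framework, and the template string here is only a stand-in.

```python
# Illustrative only: reassemble GUI code from the token sequence and embed it
# in a placeholder page template.
CONTROL_TOKENS = {"<PAD>", "<START>", "<END>"}
PAGE_TEMPLATE = "/* page template of the development framework (placeholder) */\n{gui_code}"

def tokens_to_gui_code(tokens: list[str]) -> str:
    body = " ".join(t for t in tokens if t not in CONTROL_TOKENS)
    return PAGE_TEMPLATE.format(gui_code=body)

# print(tokens_to_gui_code(["<START>", "<text>", "10", "20", "120", "40",
#                           "Hello", "</text>", "<END>"]))
```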
To compile the GUI code in the mobile application development framework, the GUI code is embedded into a page template provided by the mobile application development framework, and the mobile application development framework compiles the page template in which the GUI code is embedded to generate the corresponding user interface.
Taking the case where the mobile application development framework is Flutter as an example, Flutter is Google's mobile UI framework and can quickly build high-quality native user interfaces on iOS and Android; the generated GUI code is embedded into a Flutter page template and compiled to produce the user interface.
With the user interface generation method provided by the disclosure, the image feature data of the hand-drawn interface corresponding to the user interface to be generated is obtained, and the feature data of the predetermined initial token is obtained; the token sequence of the hand-drawn interface is determined according to the image feature data, the feature data of the initial token, and the pre-trained first recurrent neural network model; and the user interface is generated according to the token sequence. Thus, the token sequence of the hand-drawn interface, which can be converted into GUI code, is obtained from the image feature data of the hand-drawn interface, the feature data of the initial token, and the pre-trained first recurrent neural network model. A developer only needs to draw a hand-drawn interface, and the GUI code can be derived automatically to complete the development of the user interface, without writing the GUI code manually. This greatly helps developers develop the user interfaces of application programs with ease and improves the development efficiency of application programs.
Further, with reference to Fig. 2, on the basis of the embodiment shown in Fig. 1, the token sequence includes N tokens sorted in order of output, N being an integer greater than 1, and step S102 is specifically implemented as follows:
S1021: for the i-th token, input the image feature data and the feature data of the (i-1)-th token into the pre-trained first recurrent neural network model, and output the i-th token.
Specifically, i = 1, 2, 3, ..., N, that is, i takes values from 1 to N, and the 0th token is the initial token. In the present embodiment, the image feature data and the feature data of the initial token are input into the first recurrent neural network model, and the 1st token is output. The image feature data and the feature data of the 1st token are input into the first recurrent neural network model, and the 2nd token is output. And so on, the image feature data and the feature data of the (N-1)-th token are input into the first recurrent neural network model, and the N-th token is output.
In the present embodiment, the feature data of the (i-1)-th token can be extracted with the second recurrent neural network model, to improve the precision of the feature data of the token. Specifically, the (i-1)-th token is input into the second recurrent neural network model to obtain the feature data of the (i-1)-th token.
S1022: judge whether the i-th token is the end token; if not, execute step S1023; if so, execute step S1024.
In the present embodiment, the end token may be a system-default end token or an end token set by the developer, but is not limited thereto; for example, the end token is END. When the token output by the first recurrent neural network model is the end token, the token output this time is the last token in the token sequence; when the token output by the first recurrent neural network model is not the end token, the next token in the token sequence still needs to be output. By judging whether the token output this time by the first recurrent neural network model is the end token, the present embodiment obtains a complete token sequence, so that complete GUI code can subsequently be obtained from the complete token sequence, which benefits the development of the user interface.
S1023: increment the count i by one.
For example, if the 1st output token is not END, i is updated to 2; if the 2nd output token is not END, i is updated to 3; and so on, i is updated to i+1 until the i-th token is END.
S1024: sort the 1st token to the i-th token in order of output to obtain the token sequence.
For example, when the i-th token is END, the 1st token to the i-th token, that is, the N tokens, are sorted in order of output to obtain the token sequence.
With the user interface generation method provided by the disclosure, the token sequence of the hand-drawn interface, which can be converted into GUI code, is obtained from the image feature data of the hand-drawn interface, the feature data of the initial token, and the pre-trained first recurrent neural network model. A developer only needs to draw a hand-drawn interface, and the GUI code can be derived automatically to complete the development of the user interface, without writing the GUI code manually. This greatly helps developers develop the user interfaces of application programs with ease and improves the development efficiency of application programs.
Further, with reference to Fig. 3, on the basis of the embodiment shown in Fig. 1, the method may also include:
S104: obtain the training samples.
S105: train the first recurrent neural network according to the training samples to obtain the first recurrent neural network model.
The first recurrent neural network model in the present embodiment is used to obtain the token sequence of the hand-drawn interface; therefore, each obtained training sample needs to include a training hand-drawn interface and the corresponding training token sequence. The training hand-drawn interface is used as the input data of the first recurrent neural network, the training token sequence is used as the output data of the first recurrent neural network, and the first recurrent neural network is trained to obtain the first recurrent neural network model. It can be understood that the more training samples there are, the higher the precision of the first recurrent neural network model; the total number of training samples is set by the developer according to the practical situation.
In practice, the GUI code corresponding to a training hand-drawn interface is prepared in advance, and lexical analysis is performed on the GUI code to obtain a training token sequence including M training tokens, M being an integer greater than 1; that is, the training token sequence is formed by the M training tokens sorted in the order in which the lexical analysis produces them.
Because the training token sequence corresponding to each training sample includes M training tokens sorted in the order of lexical analysis, M-1 training passes are performed when the first recurrent neural network is trained with each training sample.
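The way one training token sequence of M tokens yields M-1 training passes can be illustrated as follows; the helper only builds the (input token, expected output token) pairs used in steps S51 to S55 below, and the example sequence is hypothetical.

```python
# Sketch: a sequence of M training tokens gives M-1 (m-th, (m+1)-th) token pairs,
# one pair per training pass.
def teacher_forcing_pairs(training_tokens: list[str]) -> list[tuple[str, str]]:
    return [(training_tokens[m], training_tokens[m + 1])
            for m in range(len(training_tokens) - 1)]

# teacher_forcing_pairs(["<START>", "<text>", "Hello", "</text>", "<END>"])
# -> [('<START>', '<text>'), ('<text>', 'Hello'), ('Hello', '</text>'), ('</text>', '<END>')]
```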
Further, the training token sequence corresponding to each training sample includes the sorted M training tokens, M being an integer greater than 1, and step S105 is specifically implemented as follows: perform M-1 training passes on the first recurrent neural network according to each training sample.
For the m-th training pass, m = 1, 2, 3, ..., M-1, that is, m takes values from 1 to M-1, and the following steps are included:
S51: obtain the feature data of the training hand-drawn interface and the feature data of the m-th training token.
In the present embodiment, the feature data of the training hand-drawn interface may be obtained with the convolutional neural network model, and the feature data of the m-th training token may be obtained with the second recurrent neural network model, but this is not limiting.
S52: train the first recurrent neural network according to the feature data of the training hand-drawn interface and the feature data of the m-th training token, and obtain the first recurrent neural network model.
Specifically, the feature data of the training hand-drawn interface and the feature data of the m-th training token are used as the input data of the first recurrent neural network, and the first recurrent neural network is trained to obtain the first recurrent neural network model.
S53: use the (m+1)-th training token as the expected output token of this training pass, and use the token output by the first recurrent neural network model as the actual output token of this training pass.
S54: determine the gap between the actual output token and the expected output token.
In the present embodiment, to avoid slowing down the learning of the first recurrent neural network and to promote its training, a cross-entropy cost function (cross-entropy cost function) may be used to measure the gap between the actual output token and the expected output token, but this is not limiting. For more on the cross-entropy cost function, reference may be made to the related art, and details are not repeated here.
S55: adjust the parameters of the first recurrent neural network model according to the gap.
In the present embodiment, to optimize the first recurrent neural network model, the parameters of the first recurrent neural network model may be adjusted according to the gap based on the back-propagation algorithm (Backpropagation algorithm, BP), but this is not limiting. For more on the back-propagation algorithm, reference may be made to the related art, and details are not repeated here.
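For illustration, a single training pass combining steps S52 to S55 could look like the following PyTorch sketch, where the cross-entropy cost function measures the gap and back-propagation adjusts the parameters; the model interface, the optimizer, and the tensor shapes are assumptions made for this sketch only.

```python
# Sketch of one training pass: cross-entropy between the network's output and
# the expected (m+1)-th token, followed by back-propagation and a parameter update.
import torch
import torch.nn as nn

def training_step(first_rnn, optimizer, image_features, token_features,
                  expected_token_id: torch.Tensor) -> float:
    criterion = nn.CrossEntropyLoss()                    # cross-entropy cost function
    logits = first_rnn(image_features, token_features)   # actual output: scores over tokens
    loss = criterion(logits, expected_token_id)          # gap between actual and expected output
    optimizer.zero_grad()
    loss.backward()                                      # back-propagation of the gap
    optimizer.step()                                     # adjust the network parameters
    return loss.item()
```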
With the user interface generation method provided by the disclosure, the first recurrent neural network model is trained in advance, and the token sequence of the hand-drawn interface, which can be converted into GUI code, is obtained from the image feature data of the hand-drawn interface, the feature data of the initial token, and the pre-trained first recurrent neural network model. A developer only needs to draw a hand-drawn interface, and the GUI code can be derived automatically to complete the development of the user interface, without writing the GUI code manually. This greatly helps developers develop the user interfaces of application programs with ease and improves the development efficiency of application programs.
Fig. 4 is a structural schematic diagram of a user interface generation apparatus provided by an embodiment of the present disclosure. The present embodiment provides a user interface generation apparatus, which is the execution subject of the user interface generation method, and the execution subject is composed of hardware and/or software. As shown in Fig. 4, the user interface generation apparatus includes an obtaining module 11, a determining module 12, and a generating module 13.
The obtaining module 11 is configured to obtain the image feature data of the hand-drawn interface corresponding to the user interface to be generated, and to obtain the feature data of the predetermined initial token.
The determining module 12 is configured to determine the token sequence of the hand-drawn interface according to the image feature data, the feature data of the initial token, and the pre-trained first recurrent neural network model.
The generating module 13 is configured to generate the user interface according to the token sequence.
Further, the token sequence includes N tokens sorted in order of output, N being an integer greater than 1, and the determining module 12 is specifically configured to:
for the i-th token, where i = 1, 2, 3, ..., N and the 0th token is the initial token:
input the image feature data and the feature data of the (i-1)-th token into the pre-trained first recurrent neural network model, and output the i-th token;
judge whether the i-th token is the end token, and if not, increment the count i by one;
if so, sort the 1st token to the i-th token in order of output to obtain the token sequence.
Further, the apparatus also includes a first training module.
The first training module is configured to obtain the training samples, where each training sample includes a training hand-drawn interface and the corresponding training token sequence;
and to train the first recurrent neural network according to the training samples to obtain the first recurrent neural network model.
Further, the training token sequence corresponding to each training sample includes the sorted M training tokens, M being an integer greater than 1, and the first training module is specifically configured to:
perform M-1 training passes on the first recurrent neural network according to each training sample;
obtain the feature data of the training hand-drawn interface and the feature data of the m-th training token, where m = 1, 2, 3, ..., M-1;
train the first recurrent neural network according to the feature data of the training hand-drawn interface and the feature data of the m-th training token to obtain the first recurrent neural network model;
use the (m+1)-th training token as the expected output token of this training pass, and use the token output by the first recurrent neural network model as the actual output token of this training pass;
determine the gap between the actual output token and the expected output token;
adjust the parameters of the first recurrent neural network model according to the gap.
Further, the obtaining module 11 includes a first obtaining unit 111.
The first obtaining unit 111 is configured to receive the hand-drawn interface corresponding to the user interface to be generated, convert the hand-drawn interface into a matrix, and obtain the image matrix of the hand-drawn interface;
and to input the image matrix into the convolutional neural network model to obtain the image feature data of the hand-drawn interface, where the convolutional neural network model is obtained by training the convolutional neural network with the training hand-drawn interfaces.
Further, the obtaining module 11 includes a second obtaining unit 112.
The second obtaining unit 112 is configured to input the predetermined initial token into the second recurrent neural network model to obtain the feature data of the initial token.
Further, the apparatus also includes a second training unit.
The second training unit is configured to obtain the GUI code corresponding to each training hand-drawn interface, perform lexical analysis on the corresponding GUI code, and obtain the training tokens of each training hand-drawn interface;
and to train the second recurrent neural network with the training tokens to obtain the second recurrent neural network model.
It should be noted that the foregoing explanation of the user interface generation method embodiments also applies to the user interface generation apparatus of this embodiment, and details are not repeated here.
With the user interface generation apparatus provided by the disclosure, the image feature data of the hand-drawn interface corresponding to the user interface to be generated is obtained, and the feature data of the predetermined initial token is obtained; the token sequence of the hand-drawn interface is determined according to the image feature data, the feature data of the initial token, and the pre-trained first recurrent neural network model; and the user interface is generated according to the token sequence. Thus, the token sequence of the hand-drawn interface, which can be converted into GUI code, is obtained from the image feature data of the hand-drawn interface, the feature data of the initial token, and the pre-trained first recurrent neural network model. A developer only needs to draw a hand-drawn interface, and the GUI code can be derived automatically to complete the development of the user interface, without writing the GUI code manually. This greatly helps developers develop the user interfaces of application programs with ease and improves the development efficiency of application programs.
Fig. 5 is a structural schematic diagram of an electronic device provided by an embodiment of the present disclosure. The electronic device includes:
a memory 1001, a processor 1002, and a computer program stored on the memory 1001 and runnable on the processor 1002.
The processor 1002, when executing the program, implements the user interface generation method provided in the above embodiments.
Further, the electronic device also includes:
a communication interface 1003 for communication between the memory 1001 and the processor 1002.
The memory 1001 is configured to store the computer program runnable on the processor 1002.
The memory 1001 may include a high-speed RAM memory, and may also include a non-volatile memory, for example at least one magnetic disk memory.
The processor 1002 is configured to implement, when executing the program, the user interface generation method described in the above embodiments.
If the memory 1001, the processor 1002, and the communication interface 1003 are implemented independently, the communication interface 1003, the memory 1001, and the processor 1002 may be connected to one another by a bus and complete mutual communication. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is used in Fig. 5, but this does not mean that there is only one bus or one type of bus.
Optionally, in a specific implementation, if the memory 1001, the processor 1002, and the communication interface 1003 are integrated on one chip, the memory 1001, the processor 1002, and the communication interface 1003 may complete mutual communication through an internal interface.
The processor 1002 may be a central processing unit (Central Processing Unit, CPU), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), or one or more integrated circuits configured to implement the embodiments of the present disclosure.
The disclosure also provides a computer-readable storage medium on which a computer program is stored, and the program, when executed by a processor, implements the user interface generation method described above.
In the description of this specification, descriptions with reference to the terms "one embodiment", "some embodiments", "an example", "a specific example", or "some examples" mean that specific features, structures, materials, or characteristics described in connection with the embodiment or example are included in at least one embodiment or example of the disclosure. In this specification, schematic expressions of the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, provided that they do not contradict one another, those skilled in the art may combine the features of the different embodiments or examples described in this specification.
In addition, the terms "first" and "second" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Thus, a feature defined with "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the disclosure, "a plurality of" means at least two, for example two or three, unless otherwise specifically defined.
Any process or method description in a flowchart or otherwise described herein may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing custom logic functions or steps of the process, and the scope of the preferred embodiments of the disclosure includes additional implementations in which functions may be executed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order according to the functions involved, as should be understood by those skilled in the art to which the embodiments of the disclosure pertain.
The logic and/or steps represented in the flowcharts or otherwise described herein may, for example, be considered an ordered list of executable instructions for implementing logic functions, and may be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processor, or another system that can fetch instructions from an instruction execution system, apparatus, or device and execute them). For the purposes of this specification, a "computer-readable medium" may be any means that can contain, store, communicate, propagate, or transmit a program for use by, or in connection with, an instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include the following: an electrical connection portion (electronic device) having one or more wirings, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program can be printed, because the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner when necessary, and then stored in a computer memory.
It should be understood that each part of the disclosure may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, a plurality of steps or methods may be implemented with software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or a combination of the following techniques known in the art may be used: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gates, a programmable gate array (PGA), a field-programmable gate array (FPGA), and the like.
Those skilled in the art can understand that all or part of the steps carried by the method of the above embodiments can be completed by instructing relevant hardware through a program; the program may be stored in a computer-readable storage medium, and the program, when executed, includes one of or a combination of the steps of the method embodiments.
In addition, the functional units in the embodiments of the disclosure may be integrated in one processing module, or each unit may exist alone physically, or two or more units may be integrated in one module. The above integrated module may be implemented in the form of hardware, or may be implemented in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
The above-mentioned storage medium may be a read-only memory, a magnetic disk, an optical disc, or the like. Although the embodiments of the disclosure have been shown and described above, it can be understood that the above embodiments are exemplary and shall not be understood as limiting the disclosure, and those skilled in the art can change, modify, replace, and vary the above embodiments within the scope of the disclosure.