Detailed Description
To enable those skilled in the art to better understand the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of protection of the present invention.
It should be noted that the terms "first," "second," and the like in the description, the claims, and the drawings of the present invention are used to distinguish between similar objects and are not necessarily used to describe a particular sequence or chronological order. It is to be understood that the data so used may be interchanged where appropriate, so that the embodiments of the invention described herein can be implemented in orders other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, system, article, or apparatus.
In accordance with an embodiment of the present invention, a method embodiment of a light field camera interface module optimization method is provided. It should be noted that the steps illustrated in the flowchart of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order other than that shown herein.
Example 1
FIG. 1 is a flowchart of a light field camera interface module optimization method according to an embodiment of the invention. As shown in FIG. 1, the method comprises the following steps:
Step S102: acquiring light field camera interface data and working parameter values.
Specifically, in the prior art, interface optimization merely resets or modifies the interface data or the interface configuration mode through an optimization program, which, for high-speed interface configuration and real-time high-traffic interface optimization, leads to the negative effects of low efficiency and low precision. To solve this technical problem, the embodiment of the invention acquires the interface data and the working parameter values during camera operation, wherein the working parameter values optionally include: data transmission flow and data compatibility breadth.
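For illustration only, the acquisition of step S102 may be sketched in Python as follows; the camera handle, its read_interface_config() and read_statistics() calls, the field names, and the units are hypothetical assumptions rather than part of the claimed method.

from dataclasses import dataclass

@dataclass
class WorkingParameters:
    data_transmission_flow: float     # assumed unit: measured interface throughput in MB/s
    data_compatibility_breadth: int   # assumed measure: number of data formats the interface accepts

def acquire_interface_state(camera):
    # Read the current interface configuration and runtime counters from a running camera.
    # "camera" is a hypothetical handle; both calls below are assumptions, not a real SDK API.
    interface_data = camera.read_interface_config()
    stats = camera.read_statistics()
    params = WorkingParameters(
        data_transmission_flow=stats["throughput_mb_s"],
        data_compatibility_breadth=stats["supported_formats"],
    )
    return interface_data, params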
Step S104: inputting the working parameter values into a parameter comparison matrix to obtain optimization target data.
Optionally, inputting the working parameter values into the parameter comparison matrix to obtain the optimization target data includes: generating the parameter comparison matrix according to a preset user requirement, wherein the parameter comparison matrix is a two-dimensional matrix consisting of two elements, namely original data and optimized data; and inputting the working parameter values into the parameter comparison matrix and outputting the optimization target data.
Specifically, after the working parameter values are obtained in the embodiment of the invention, they can be used as source data for the optimization target data and matched to the optimization target data through the parameter comparison matrix, so that the optimization target values corresponding to all of the working parameter values are obtained quickly.
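As an illustration of how such a lookup might work, the following Python sketch builds a two-column comparison matrix (original data paired with optimized data) from a preset user requirement and maps a measured working parameter value to its optimization target; the row contents and the nearest-match rule are assumptions made for the example, not limitations of the embodiment.

def build_comparison_matrix(user_requirement):
    # Each row pairs an original-data value with the optimized-data value prescribed by the user requirement.
    return [(row["original"], row["optimized"]) for row in user_requirement["rows"]]

def lookup_optimization_target(matrix, working_value):
    # Assumed matching rule: pick the row whose original-data entry is closest to the measured value
    # and return its optimized-data entry as the optimization target.
    original, optimized = min(matrix, key=lambda row: abs(row[0] - working_value))
    return optimized

# Hypothetical usage with made-up throughput figures (MB/s):
matrix = build_comparison_matrix({"rows": [{"original": 40.0, "optimized": 90.0},
                                           {"original": 80.0, "optimized": 120.0}]})
target_flow = lookup_optimization_target(matrix, 75.0)   # -> 120.0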
Step S106: optimizing the light field camera interface data according to the optimization target data to obtain optimized interface data.
Optionally, after optimizing the light field camera interface data according to the optimization target data to obtain the optimized interface data, the method further includes: comparing and verifying the optimized interface data against the light field camera interface data to obtain a verification result.
Step S108: inputting the optimized interface data into an interface module to obtain an optimized light field camera interface.
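Reusing the helper sketches above, steps S102 to S108, together with the optional verification described after step S106, could be chained as in the following Python sketch; optimize_interface_data, the simple changed-field verification, and the write_interface_config() call are placeholders standing in for implementation-specific logic and are not drawn from the original disclosure.

def optimize_interface_data(interface_data, targets):
    # Placeholder optimization: annotate the interface configuration with the target values;
    # a real implementation would rewrite timing, packet size, format lists, etc.
    optimized = dict(interface_data)
    optimized.update({"target_flow": targets["flow"], "target_breadth": targets["breadth"]})
    return optimized

def verify_against_original(optimized_data, interface_data):
    # Verification result: the fields whose values differ from the original interface data.
    return {key: (interface_data.get(key), value)
            for key, value in optimized_data.items()
            if interface_data.get(key) != value}

def optimize_light_field_camera_interface(camera, user_requirement):
    interface_data, params = acquire_interface_state(camera)                        # step S102
    matrix = build_comparison_matrix(user_requirement)                               # step S104
    targets = {
        "flow": lookup_optimization_target(matrix, params.data_transmission_flow),
        "breadth": lookup_optimization_target(matrix, params.data_compatibility_breadth),
    }
    optimized_data = optimize_interface_data(interface_data, targets)                # step S106
    verification = verify_against_original(optimized_data, interface_data)           # optional check
    camera.write_interface_config(optimized_data)                                    # step S108 (assumed call)
    return optimized_data, verification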
This embodiment thereby solves the technical problem that, in the prior art, interface optimization merely resets or modifies the interface data or the interface configuration mode through an optimization program, which causes the negative effects of low efficiency and low precision for high-speed interface configuration and real-time high-traffic interface optimization.
Example 2
FIG. 2 is a block diagram of a light field camera interface module optimization apparatus according to an embodiment of the present invention. As shown in FIG. 2, the apparatus comprises:
The acquisition module 20 is configured to acquire light field camera interface data and working parameter values.
Specifically, in the prior art, interface optimization merely resets or modifies the interface data or the interface configuration mode through an optimization program, which, for high-speed interface configuration and real-time high-traffic interface optimization, leads to the negative effects of low efficiency and low precision. To solve this technical problem, the embodiment of the invention acquires the interface data and the working parameter values during camera operation, wherein the working parameter values optionally include: data transmission flow and data compatibility breadth.
The input module 22 is configured to input the working parameter values into a parameter comparison matrix to obtain optimization target data.
Optionally, the input module includes: a generation unit configured to generate the parameter comparison matrix according to a preset user requirement, wherein the parameter comparison matrix is a two-dimensional matrix consisting of two elements, namely original data and optimized data; and an input unit configured to input the working parameter values into the parameter comparison matrix and output the optimization target data.
Specifically, after the working parameter values are obtained in the embodiment of the invention, they can be used as source data for the optimization target data and matched to the optimization target data through the parameter comparison matrix, so that the optimization target values corresponding to all of the working parameter values are obtained quickly.
The optimization module 24 is configured to perform optimization processing on the light field camera interface data according to the optimization target data to obtain optimized interface data.
Optionally, the apparatus further includes a verification module configured to compare and verify the optimized interface data against the light field camera interface data to obtain a verification result.
The input module 26 is configured to input the optimized interface data into the interface module to obtain an optimized light field camera interface.
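For orientation only, the modules of FIG. 2 could be composed as in the following Python sketch, which reuses the helpers from Example 1; the class name and the wiring are illustrative assumptions, and the input module 26 is represented here by the final write-back step.

class InterfaceModuleOptimizer:
    # Illustrative wiring of acquisition module 20, input module 22 (with its generation
    # and input units), optimization module 24, the optional verification module, and
    # input module 26; all internals reuse the hypothetical helpers sketched above.

    def __init__(self, camera, user_requirement):
        self.camera = camera
        self.matrix = build_comparison_matrix(user_requirement)          # generation unit of input module 22

    def run(self):
        interface_data, params = acquire_interface_state(self.camera)    # acquisition module 20
        targets = {                                                       # input unit of input module 22
            "flow": lookup_optimization_target(self.matrix, params.data_transmission_flow),
            "breadth": lookup_optimization_target(self.matrix, params.data_compatibility_breadth),
        }
        optimized = optimize_interface_data(interface_data, targets)      # optimization module 24
        verification = verify_against_original(optimized, interface_data) # verification module
        self.camera.write_interface_config(optimized)                     # input module 26 (assumed call)
        return optimized, verification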
This embodiment thereby solves the technical problem that, in the prior art, interface optimization merely resets or modifies the interface data or the interface configuration mode through an optimization program, which causes the negative effects of low efficiency and low precision for high-speed interface configuration and real-time high-traffic interface optimization.
According to another aspect of the embodiments of the present invention, there is also provided a nonvolatile storage medium, where the nonvolatile storage medium includes a stored program, and when the program runs, a device in which the nonvolatile storage medium is located is controlled to execute the light field camera interface module optimization method.
Specifically, the method includes the following steps: acquiring light field camera interface data and working parameter values; inputting the working parameter values into a parameter comparison matrix to obtain optimization target data; optimizing the light field camera interface data according to the optimization target data to obtain optimized interface data; and inputting the optimized interface data into an interface module to obtain an optimized light field camera interface. Optionally, the working parameter values include: data transmission flow and data compatibility breadth. Optionally, inputting the working parameter values into the parameter comparison matrix to obtain the optimization target data includes: generating the parameter comparison matrix according to a preset user requirement, wherein the parameter comparison matrix is a two-dimensional matrix consisting of two elements, namely original data and optimized data; and inputting the working parameter values into the parameter comparison matrix and outputting the optimization target data. Optionally, after optimizing the light field camera interface data according to the optimization target data to obtain the optimized interface data, the method further includes: comparing and verifying the optimized interface data against the light field camera interface data to obtain a verification result.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device including a processor and a memory; the memory stores computer-readable instructions, and the processor is configured to run the computer-readable instructions, where the computer-readable instructions, when executed, perform the light field camera interface module optimization method.
Specifically, the method includes the following steps: acquiring light field camera interface data and working parameter values; inputting the working parameter values into a parameter comparison matrix to obtain optimization target data; optimizing the light field camera interface data according to the optimization target data to obtain optimized interface data; and inputting the optimized interface data into an interface module to obtain an optimized light field camera interface. Optionally, the working parameter values include: data transmission flow and data compatibility breadth. Optionally, inputting the working parameter values into the parameter comparison matrix to obtain the optimization target data includes: generating the parameter comparison matrix according to a preset user requirement, wherein the parameter comparison matrix is a two-dimensional matrix consisting of two elements, namely original data and optimized data; and inputting the working parameter values into the parameter comparison matrix and outputting the optimization target data. Optionally, after optimizing the light field camera interface data according to the optimization target data to obtain the optimized interface data, the method further includes: comparing and verifying the optimized interface data against the light field camera interface data to obtain a verification result.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
In the foregoing embodiments of the present invention, the description of each embodiment has its own emphasis; for any part not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
In addition, FIG. 3 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present application. As shown in FIG. 3, the terminal device may include an input device 30, a processor 31, an output device 32, a memory 33, and at least one communication bus 34. The communication bus 34 is used to enable communication connections between the elements. The memory 33 may comprise a high-speed RAM and may further comprise a non-volatile memory (NVM), such as at least one magnetic disk memory, in which various programs may be stored for performing various processing functions and implementing the method steps of the present embodiment.
Optionally, the processor 31 may be implemented as, for example, a central processing unit (CPU), an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a controller, a microcontroller, a microprocessor, or another electronic component, and the processor 31 is coupled to the input device 30 and the output device 32 through wired or wireless connections.
Optionally, the input device 30 may include a variety of input devices, for example, at least one of a user-oriented user interface, a device-oriented device interface, a programmable software interface, a camera, and a sensor. Optionally, the device-oriented device interface may be a wired interface for data transmission between devices, or a hardware insertion interface (such as a USB interface, a serial port, etc.) for data transmission between devices. Optionally, the user-oriented user interface may be, for example, user-oriented control keys, a voice input device for receiving voice input, or a touch-sensitive device (e.g., a touch screen, a touch pad, etc. having a touch-sensing function) for receiving a user's touch input. Optionally, the programmable software interface may be, for example, an entry for a user to edit or modify a program, such as an input pin interface or an input interface of a chip. Optionally, a transceiver with a communication function may also be included, such as a radio frequency transceiver chip, a baseband processing chip, or a transceiver antenna. An audio input device such as a microphone may receive voice data. The output device 32 may include a display, an audio output device, or the like.
In this embodiment, the processor of the terminal device may include the functions for executing each module of the data processing apparatus in each of the above devices; for the specific functions and technical effects, reference may be made to the above embodiments, which are not repeated here.
FIG. 4 is a schematic diagram of a hardware structure of a terminal device according to another embodiment of the present application. FIG. 4 is a specific embodiment of the implementation of FIG. 3. As shown in FIG. 4, the terminal device of the present embodiment includes a processor 41 and a memory 42.
The processor 41 executes the computer program code stored in the memory 42 to implement the methods of the above-described embodiments.
The memory 42 is configured to store various types of data to support operation at the terminal device. Examples of such data include instructions for any application or method operating on the terminal device, such as messages, pictures, and video. The memory 42 may include a random access memory (RAM) and may also include a non-volatile memory, such as at least one magnetic disk memory.
Optionally, the processor 41 is provided in a processing component 40. The terminal device may further include: a communication component 43, a power supply component 44, a multimedia component 45, an audio component 46, an input/output interface 47, and/or a sensor component 48. The components specifically included in the terminal device are set according to actual requirements, which is not limited in this embodiment.
The processing component 40 generally controls the overall operation of the terminal device. The processing component 40 may include one or more processors 41 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 40 may include one or more modules that facilitate interaction between the processing component 40 and other components. For example, the processing component 40 may include a multimedia module to facilitate interaction between the multimedia component 45 and the processing component 40.
The power supply component 44 provides power to the various components of the terminal device. The power supply component 44 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the terminal device.
The multimedia component 45 includes a display screen that provides an output interface between the terminal device and the user. In some embodiments, the display screen may include a liquid crystal display (LCD) and a touch panel (TP). If the display screen includes a touch panel, the display screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation.
The audio component 46 is configured to output and/or input audio signals. For example, the audio component 46 includes a microphone (MIC) configured to receive external audio signals when the terminal device is in an operational mode, such as a speech recognition mode. The received audio signals may be further stored in the memory 42 or transmitted via the communication component 43. In some embodiments, the audio component 46 further includes a speaker for outputting audio signals.
The input/output interface 47 provides an interface between the processing component 40 and peripheral interface modules, which may be click wheels, buttons, and the like. These buttons may include, but are not limited to: a volume button, a start button, and a lock button.
The sensor component 48 includes one or more sensors for providing status assessments of various aspects of the terminal device. For example, the sensor component 48 may detect the open/closed state of the terminal device, the relative positioning of components, and the presence or absence of user contact with the terminal device. The sensor component 48 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact, including detecting the distance between the user and the terminal device. In some embodiments, the sensor component 48 may also include a camera or the like.
The communication component 43 is configured to facilitate communication between the terminal device and other devices in a wired or wireless manner. The terminal device may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one embodiment, the terminal device may include a SIM card slot for inserting a SIM card, so that the terminal device may log into a GPRS network and establish communication with a server through the Internet.
From the above, it will be appreciated that the communication component 43, the audio component 46, the input/output interface 47, and the sensor component 48 referred to in the embodiment of FIG. 4 may be implemented as the input device in the embodiment of FIG. 3.
In the several embodiments provided in the present application, it should be understood that the disclosed technical content may be implemented in other manners. The apparatus embodiments described above are merely exemplary. For example, the division of the units is merely a logical function division, and there may be other division manners in actual implementation; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling, direct coupling, or communication connection shown or discussed may be through some interfaces, units, or modules, and may be electrical or in other forms.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is merely a preferred embodiment of the present invention. It should be noted that several improvements and modifications may be made by those skilled in the art without departing from the principles of the present invention, and such improvements and modifications shall also fall within the scope of protection of the present invention.