Disclosure of Invention
The invention aims to provide an optical information collector, and an optical information acquisition method, that reduce power consumption.
In order to achieve the above purpose, the present application adopts the following technical solutions:
The present application provides an optical information collector, characterized by comprising: an image sensor for collecting image data of optical information; a decoding unit for decoding the image data according to a preset decoding algorithm; and a central processing unit which, upon being triggered, controls the image sensor to acquire the image data and controls the decoding unit to decode the image data, wherein, once triggered, the central processing unit issues an instruction to discard N frames of image data with specific frame numbers, the N frames being image data that was acquired on the previous trigger and remains in the optical information collector.
Optionally, the N frames of image data with specific frame numbers include image data remaining in a storage area of the image sensor.
Optionally, the optical information collector comprises an image signal processor for receiving the image data acquired by the image sensor and transmitting it to the decoding unit, wherein the N frames of image data with specific frame numbers include image data remaining in the image signal processor.
Optionally, discarding the N frames of image data with specific frame numbers includes: the decoding unit not receiving the N frames; the decoding unit not decoding the N frames; or the decoding unit not outputting or displaying the decoded information of the N frames.
Optionally, the decoding unit starts decoding from the (N+1)th frame of image data.
The present application provides an optical information acquisition method, characterized by comprising: a central processing unit controlling, upon a trigger, an image sensor to acquire and output image data; the central processing unit receiving the image data and issuing an instruction to discard N frames of image data with specific frame numbers, the N frames being image data that was acquired on the previous trigger and remains in the collector; and a decoding unit decoding the image data.
Optionally, the N frames of image data with specific frame numbers include image data remaining in a storage area of the image sensor.
Optionally, an image signal processor receives the image data acquired by the image sensor and transmits it to the decoding unit; the N frames of image data with specific frame numbers include image data remaining in the image signal processor.
Optionally, discarding the N frames of image data with specific frame numbers includes: the decoding unit not receiving the N frames; the decoding unit not decoding the N frames; or the decoding unit not outputting or displaying the decoded information of the N frames.
Optionally, the decoding unit starts decoding from the (N+1)th frame of image data.
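The discard-and-decode flow above can be sketched as follows. This is an illustrative outline only, assuming a residual frame count of two; the names read_frame and try_decode are assumptions for the sketch, not part of the application:

```python
# Illustrative sketch of discarding N residual frames before decoding.
# read_frame() and try_decode() are hypothetical callables, not a real API.

def acquire_and_decode(read_frame, try_decode, n_residual=2, max_frames=50):
    """On each trigger, skip the first n_residual frames (stale image data
    left over from the previous trigger) and decode from frame N+1 onward."""
    for index in range(max_frames):
        frame = read_frame()
        if index < n_residual:
            continue  # discard the stale frames without decoding them
        result = try_decode(frame)
        if result is not None:
            return result  # stop once one frame decodes successfully
    return None  # no frame decoded within max_frames
```

Because the stale frames are skipped before try_decode is called, leftover data from the previous trigger can never be reported as the result of the new trigger.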
The present application provides an optical information collector, characterized by comprising: an image sensor for collecting image data of optical information; a decoding unit for receiving and decoding the image data; and a central processing unit for controlling the image sensor to acquire the image data and controlling the decoding unit to decode it, wherein, upon a trigger, the central processing unit controls the image sensor to acquire and output a fixed number of frames in a fixed frame mode and controls the decoding unit to decode them, and the decoding unit stops decoding the remaining frames of the fixed-frame batch as soon as one frame is successfully decoded.
Optionally, the fixed frame mode includes: when the decoding unit succeeds before the image sensor has finished acquiring the fixed number of frames, the image sensor nevertheless continues to acquire and output the full fixed number of frames.
Optionally, the fixed frame mode includes: the central processing unit controls the decoding unit to sequentially receive and decode the fixed number of frames, and controls the image sensor to acquire the fixed number of frames again when, or before, the last frame of the batch fails to decode.
Optionally, the optical information collector does not include an image signal processor, or the image data acquired by the image sensor is not optimized by an image signal processor.
Optionally, the image sensor acquires image data in the following order: in the fixed frame mode for a preset number of times, and then continuously in a digital stream mode.
The present application provides an optical information acquisition method, characterized by comprising: a central processing unit, upon a trigger, controlling an image sensor to acquire and output a fixed number of frames in a fixed frame mode; and a decoding unit receiving and decoding the image data, and stopping decoding the remaining frames of the fixed-frame batch as soon as one frame is successfully decoded.
Optionally, when the decoding unit succeeds before the image sensor has finished acquiring the fixed number of frames, the image sensor nevertheless continues to acquire and output the full fixed number of frames.
Optionally, the central processing unit controls the decoding unit to sequentially receive and decode the fixed number of frames, and controls the image sensor to acquire the fixed number of frames again when, or before, the last frame of the batch fails to decode.
Optionally, the optical information collector does not include an image signal processor, or the image data acquired by the image sensor is not optimized by an image signal processor.
Optionally, the image sensor acquires image data in the following order: in the fixed frame mode for a preset number of times, and then continuously in a digital stream mode.
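A rough sketch of the fixed frame mode just described follows. The burst size and the names capture_burst and try_decode are illustrative assumptions, not the application's interface:

```python
def fixed_frame_capture(capture_burst, try_decode, fixed_n=4, max_bursts=3):
    """Fixed frame mode: acquire bursts of fixed_n frames; stop decoding the
    rest of a burst as soon as one frame succeeds, and acquire a fresh burst
    when the last frame of a burst has still not decoded successfully."""
    for _ in range(max_bursts):
        for frame in capture_burst(fixed_n):
            result = try_decode(frame)
            if result is not None:
                return result  # remaining frames of this burst are skipped
    return None  # every burst failed
```

Skipping the rest of a burst on the first success is what saves decoding work (and hence power) relative to decoding every acquired frame.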
The present application provides an optical information collector, characterized by comprising: an image sensor for collecting image data of optical information; a memory preset with one or more decoding algorithms; a decoding unit for receiving and decoding the image data; and a central processing unit for controlling the image sensor to continuously acquire image data in a digital stream mode and controlling the decoding unit to decode the image data sequentially, wherein, once the decoding unit succeeds or decoding times out, the central processing unit controls the image sensor to stop continuous acquisition in the digital stream mode and thereafter to acquire and output a fixed number of frames.
Optionally, the optical information collector does not have an image signal processor, or does not perform optimization processing on the image data through an image signal processor.
Optionally, the image sensor outputs RAW-format image data, and the decoding unit obtains grayscale image data from the RAW-format image data and decodes based on the grayscale image data.
Optionally, the fixed number of frames is one frame or two frames.
Optionally, the image data collected by the image sensor is transmitted directly to the decoding unit for decoding.
The present application provides an optical information acquisition method, characterized by comprising: a central processing unit controlling an image sensor to continuously acquire and output image data in a digital stream mode; a decoding unit receiving and decoding the image data, the image sensor being controlled to stop acquisition in the digital stream mode once the decoding unit succeeds; and the image sensor then being controlled to acquire and output a fixed number of frames.
Optionally, the optical information collector does not have an image signal processor, or does not perform optimization processing on the image data through an image signal processor.
Optionally, the image sensor outputs RAW-format image data, and the decoding unit obtains grayscale image data from the RAW-format image data and decodes based on the grayscale image data.
Optionally, the fixed number of frames is one frame or two frames.
Optionally, the image data collected by the image sensor is transmitted directly to the decoding unit for decoding.
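The stream-then-fixed behavior above can be sketched as a minimal state change: stream until success or timeout, then fall back to a fixed number of frames per trigger. The Sensor class and its attribute names are hypothetical, not a real driver interface:

```python
class Sensor:
    """Toy stand-in for the image sensor's acquisition-mode state."""
    def __init__(self):
        self.mode = "stream"  # start in digital stream mode

def run_trigger(sensor, frames, try_decode, fixed_n=1):
    """Decode frames sequentially in stream mode; on success *or* on timeout
    (modelled here as the stream running out), switch the sensor to acquiring
    a fixed number of frames (one or two) per trigger."""
    result = None
    for frame in frames:  # continuous acquisition in digital stream mode
        result = try_decode(frame)
        if result is not None:
            break  # successful decode: stop streaming
    sensor.mode = ("fixed", fixed_n)  # leave stream mode either way
    return result
```

Either outcome, success or timeout, leaves the sensor in fixed-frame mode, so later triggers acquire only one or two frames instead of a continuous stream.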
Detailed Description
For a better understanding of the objects, structures, features, and effects of the present application, reference should be made to the drawings and to the detailed description.
Referring to FIG. 1, a simplified block diagram of an optical information collector 100 according to one embodiment is shown. As described in further detail below, the optical information collector 100 may be configured to collect one or more types of optical information, such as one-dimensional codes, two-dimensional codes, OCR graphics and text, ultraviolet anti-counterfeiting codes, infrared anti-counterfeiting codes, and the like.
The optical information collector 100 may include at least one camera 1. The camera 1 may include an optical system 2 (lens) for capturing light in combination with an image sensor 3 (sensor) for photoelectrically converting the light captured by the optical system 2. The optical system 2 may include one or more mirrors, prisms, lenses, or a combination thereof. There may also be one or more image sensors 3: one image sensor 3 may correspond to one optical system 2, a plurality of image sensors 3 may share the same optical system 2, or a plurality of optical systems 2 may share the same image sensor 3. The image sensor 3 may be a CCD, CMOS, or other type of image sensor, and is configured to convert an optical signal into an electrical signal, thereby outputting a digital signal of image data.
The optical information collector 100 may comprise one or more fill lights 4, arranged to illuminate the optical information when the camera 1 collects image data. Of course, the fill light 4 may be left off under adequate ambient lighting conditions, or the optical information collector 100 may have no fill light 4 at all. The fill light 4 can operate in various modes: for example, it may illuminate continuously while the camera 1 collects optical information; it may illuminate in synchronization with the exposure time of the image sensor 3 of the camera 1 (Chinese patent CN201810098421.0 discloses a technical scheme in which the fill light 4 is synchronized with the exposure time of the image sensor 3, the entire contents of which are incorporated herein by reference); or it may provide pulsed illumination whose pulse time overlaps part of the exposure time of the image sensor 3.
The optical information collector 100 may also include a central processor 5 for executing various instructions.
The optical information collector 100 may further comprise a separate or integrated memory 6. One or more decoding algorithms are preset in the memory 6 as needed, and the memory 6 may also store other programs or instructions. The memory 6 may include one or more non-transitory storage media, such as volatile and/or non-volatile memory, which may be fixed or removable. In particular, the memory 6 may be configured to store information, data, applications, instructions, or the like for enabling the processing module to carry out various functions in accordance with example embodiments of the present invention. For example, the memory 6 may be configured to buffer input data for processing by the central processor 5. Additionally or alternatively, the memory 6 may be configured to store instructions executed by the central processor 5. The memory 6 may serve as main memory and be included in a volatile storage device, e.g. RAM or another form that retains its contents only during operation, and/or the memory 6 may be included in a non-volatile storage device, such as ROM, EPROM, EEPROM, FLASH, or another type of storage device that retains its contents independent of the power state of the processing module. The memory 6 may also be included in a secondary storage device, such as an external disk, that stores large amounts of data. In some embodiments, the disk storage may communicate with the central processor 5 via a data bus or other routing means using input/output means. The secondary storage may comprise a hard disk, a compact disk, a DVD, a memory card, or any other type of mass storage known to those skilled in the art. The memory 6 may store one or more of the various optical information gathering, transmission, processing, and decoding processes or methods described below.
The optical information collector 100 may further include an image signal processor 7 (Image Signal Processor, abbreviated as ISP), configured to perform optimization processing on the image data collected by the camera 1. The optimization processing includes one or more of linear correction, noise removal, dead pixel repair, color interpolation, white balance correction, exposure correction, and so on, so as to improve the quality of the image data. For optical information that does not require color recognition, some or all of the foregoing optimization steps, such as color interpolation, are unnecessary. The image signal processor 7 may process one frame of image data at a time in a single core and single thread, or may process multiple frames of image data simultaneously with multiple cores and threads. Alternatively, the optical information collector 100 may have no image signal processor 7, or the image data may simply not be optimized by the image signal processor 7.
The optical information collector 100 may further include a decoding unit 8, configured to decode the image data collected by the camera 1 according to a preset decoding algorithm so as to identify optical information, such as the encoded information of a one-dimensional or two-dimensional code, OCR graphics and text, or the encoded information of various ultraviolet/infrared anti-counterfeiting codes. The decoding unit 8 may decode one frame of image data at a time in a single core and single thread, or may decode multiple frames simultaneously with multiple cores and threads.
Alternatively, some or all of the functional modules of the image signal processor 7 may be integrated with the central processor 5 (Chinese patent CN201811115589.4 discloses a central processor with an integrated image signal processor, the entire contents of which are incorporated herein by reference); some or all of the functional blocks of the image signal processor 7 may be integrated with the image sensor 3; the decoding unit 8 may be integrated with the central processor 5; or the memory 6 may be integrated with the central processor 5. In the following embodiments, when the image data is optimized by the image signal processor 7, the image signal processor 7 and the decoding unit 8 are preferably integrated with the central processor 5 to save cost; of course, they may also be left separate from the central processor 5.
Fig. 2 and 3 show schematic views of a handheld terminal as a specific embodiment of the optical information collector 100. The handheld terminal comprises a housing 9, a display screen 11, and buttons 12. The front end of the housing 9 is provided with a scanning window 10; the camera 1 is accommodated in the housing 9 and can collect optical information through the scanning window 10. Alternatively, the optical information collector 100 may have no display screen 11 of its own and instead output information to a separate display screen 11. The optical information collector 100 may also be a stationary, desktop, or other form of terminal, or may be integrated with other devices as part of them.
The central processing unit 5 issues a trigger instruction in response to an external trigger, which may be generated by a user pressing a specific button 12, touching a specific area of the display screen 11, or making a specific gesture while operating the optical information collector 100. Once the central processing unit 5 is externally triggered, it sends out a trigger instruction according to a preset algorithm, so that the image sensor 3 is triggered to acquire image data.
The image data collected by the image sensor 3 may be optimized by the image signal processor 7 and then output to the decoding unit 8 for decoding. Referring to the block diagram of fig. 4, which shows the optical information collector 100 collecting a bar code: when the user presses the button 12 to trigger the fill light 4 and the image sensor 3 to collect image data, the image signal processor 7 may sequentially receive and optimize the image data collected by the image sensor 3 through a MIPI interface (Mobile Industry Processor Interface). The decoding unit 8 decodes the image data transmitted after optimization by the image signal processor 7; when one frame of image data is successfully decoded, the decoding unit 8 stops decoding and notifies the central processor 5 of the success, and the central processor 5 issues an instruction to control the image sensor 3 to stop collecting image data.
The image sensor 3 may continuously acquire image data in a digital stream mode, i.e., a mode in which the image sensor 3 continuously acquires image data for a preset time according to a preset algorithm. The decoding unit 8 decodes the continuously acquired image data either one frame at a time in a single thread or several frames simultaneously in multiple threads; when decoding succeeds or times out, the image sensor 3 is controlled to stop acquiring image data and the decoding unit 8 is controlled to stop decoding. For example, if the preset time is five seconds, the image sensor 3 continuously collects image data for five seconds; if none of the image data collected within those five seconds is successfully decoded, the decoding times out. If one of the frames is successfully decoded, the central processor 5 controls the image sensor 3 to stop acquiring image data and the decoding unit 8 to stop decoding, even before the five seconds have elapsed.
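The five-second example can be sketched as a simple single-threaded timeout loop. The function names read_frame and try_decode are illustrative assumptions:

```python
import time

def stream_decode(read_frame, try_decode, timeout_s=5.0):
    """Digital stream mode: decode frames as they arrive until one succeeds
    or timeout_s elapses, whichever comes first (the latter being the decode
    timeout described above)."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        result = try_decode(read_frame())
        if result is not None:
            return result  # success: sensor and decoder stop early
    return None  # decode timeout: nothing decoded within timeout_s
```

A multi-threaded variant would submit frames to a pool of decode workers instead of decoding inline, but the success-or-timeout stopping condition is the same.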
Fig. 5 shows a timing diagram 200 of the optical information collector 100 collecting optical information in the digital stream mode according to an embodiment. The timing diagram 200 shows an externally triggered trigger signal 201, a fill-light timing 202 of the fill light 4, an image data acquisition timing 203 of the image sensor 3 continuously acquiring image data, and a decoding timing 204 of the decoding unit 8. The trigger signal 201 at a high level triggers the image sensor 3 to collect image data and the fill light 4 to illuminate, and at a low level triggers the image sensor 3 to stop collecting and the fill light 4 to turn off; the fill light 4 illuminates when the fill-light timing 202 is high and is off when it is low. The image data acquisition timing 203 of the image sensor 3 is synchronized with the fill-light timing 202: the image sensor 3 exposes at the high level of the acquisition timing 203 and outputs image data at the low level. In fig. 5, the dashed arrow indicates that the first frame of image data is output to the decoding unit 8 for decoding: the decoding unit 8 receives the first frame at time point a, successfully decodes it at time point b, and feeds the success back to the central processing unit 5, which controls the image sensor 3 to stop collecting image data and the fill light 4 to stop illuminating at time point c.
Due to signal delay, the rising edge of the high level of the trigger signal 201 is slightly earlier than the rising edge of the high level of the image data acquisition timing 203, and the falling edge of the trigger signal is slightly earlier than the time point c at which the image sensor 3 stops acquiring image data. It should be noted that, when the ambient light is sufficient, fill light is not necessary.
As can be seen from the timing chart 200, while the decoding unit 8 is decoding image data, the image sensor 3 is acquiring new image data. By the time the decoding unit 8 successfully decodes the first frame, the image sensor 3 has already acquired seven frames, of which the second to seventh frames are not transmitted to the decoding unit 8 but are stored (remain) in the storage area 13 (buffer or PN junction) of the image sensor 3 or in the corresponding register 14 of the image signal processor 7. Because later-acquired image data overwrites earlier-acquired image data on a first-in, first-out basis, the seventh frame ends up stored in the register 14 of the image signal processor 7 and the sixth frame in the storage area 13 of the image sensor 3, while the second to fifth frames are overwritten and cleared.
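The first-in, first-out overwriting described here can be modelled with a two-slot buffer. This is a sketch only: the deque stands in for the sensor's storage area 13 plus the ISP's register 14, which is an assumption about capacity, not a statement about the actual hardware:

```python
from collections import deque

# Two retention slots: the storage area 13 of the image sensor and the
# register 14 of the image signal processor together keep only the last
# two frames; everything older is overwritten first-in, first-out.
pipeline = deque(maxlen=2)

for frame_number in range(2, 8):   # frames 2..7 are never sent to the decoder
    pipeline.append(frame_number)  # each new frame evicts the oldest one

# Frames 2-5 have been overwritten; frame 6 models the sensor's storage
# area 13 and frame 7 the ISP's register 14.
assert list(pipeline) == [6, 7]
```

The two surviving entries are exactly the stale frames the next trigger must discard.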
When the optical information collector 100 is triggered to collect new optical information again, the decoding unit 8 will first receive and decode the image data left over in the storage area 13 of the image sensor 3 or the register 14 of the image signal processor 7. This tends to cause decoding errors, because the leftover image data does not belong to the new optical information.
The following methods can solve this problem and avoid such decoding errors.
An alternative approach is shown in the timing diagram 300 of FIG. 6: N frames of image data with specific frame numbers (N ≥ 1) are discarded, and decoding starts from the (N+1)th frame, so that decoding errors are avoided. The timing diagram 300 shows an externally triggered trigger signal 301, a fill-light timing 302 of the fill light 4, an image data acquisition timing 303 of the image sensor 3 continuously acquiring image data, and a decoding timing 304 of the decoding unit 8. The trigger signal 301 at a high level triggers the image sensor 3 to collect image data and the fill light 4 to illuminate, and at a low level triggers the image sensor 3 to stop collecting and the fill light 4 to turn off; the fill light 4 illuminates when the fill-light timing 302 is high and is off when it is low. The image data acquisition timing 303 of the image sensor 3 is synchronized with the fill-light timing 302: the image sensor 3 exposes at the high level of the acquisition timing 303 and outputs image data at the low level. Since one frame of image data remains in each of the image sensor 3 and the image signal processor 7, the number of discarded frames N is two, and the first two frames in the acquisition timing 303 are not transmitted to the decoding unit 8. The dashed arrow in fig. 6 indicates that the third frame of image data is output to the decoding unit 8 for decoding: the decoding unit 8 receives and decodes the third frame at time point d, successfully decodes it at time point e, and feeds the success back to the central processor 5; owing to signal delay, the image sensor 3 stops acquiring image data and the fill light 4 stops illuminating at time point f. It should be noted that, when the ambient light is sufficient, fill light is not necessary. As can be seen from the image data acquisition timing 303, the image sensor 3 has by then acquired eight frames of image data; the eighth frame remains in the register 14 of the image signal processor 7 and the seventh frame in the storage area 13 of the image sensor 3. When the optical information collector 100 is triggered to collect new optical information again, it again discards the two frames of image data remaining in the image sensor 3 and the image signal processor 7 and starts decoding from the third frame, thereby avoiding decoding errors.
It should be understood that the number N of frames remaining from the previous acquisition is not limited to two: depending on how many frames remain in the image sensor 3 and the image signal processor 7, N may be one frame or more than two frames. Discarding the residual image data may include the decoding unit 8 not receiving the residual frames; the decoding unit 8 receiving but not decoding them; or the decoding unit 8 decoding them but not outputting or displaying the decoded information on the display screen 11, so that only the decoded information of the new optical information is output and displayed. For example, if it is known that the storage area 13 of the image sensor 3 and the register 14 of the image signal processor 7 hold image data acquired on the previous trigger, then when the next trigger acquires new optical information, the first two frames are discarded, and the third and subsequent frames are treated as image data of the new optical information and decoded until decoding succeeds or times out.
In one embodiment, the optical information collector 100 may skip optimization of the image data by the image signal processor 7: the image signal processor 7 merely receives the RAW-format image data transmitted by the image sensor 3 and passes the unoptimized data on to the decoding unit 8. The decoding unit 8 directly receives grayscale image data (taking only the luminance signal of the RAW image data), which facilitates binarized decoding; the image signal processor 7 then serves only as a simple data transmission channel, so no image data remains in its register 14. Alternatively, the optical information collector 100 may omit the image signal processor 7 altogether, with the RAW image data collected by the image sensor 3 transmitted directly to the decoding unit 8 through an interface such as a DVP (Digital Video Port) or LVDS (Low Voltage Differential Signaling) interface. If only one frame of image data remains in the storage area 13 of the image sensor 3, only one frame needs to be discarded when new optical information is collected; the second and subsequent frames are the image data of the new optical information and are decoded until decoding succeeds or times out. Since decoding starts from the second frame rather than the third, the processing time and fill-light time of one frame are saved, improving decoding speed and reducing power consumption. In these embodiments, because the image data is not optimized by the image signal processor 7, some image data processing time can in theory also be saved.
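As a toy illustration of taking only the luminance of the RAW data for binarized decoding, the following sketch treats each RAW pixel value as a grayscale sample and thresholds it. The threshold value, frame shape, and function name are assumptions for the sketch, not the application's algorithm:

```python
def raw_to_binary(raw_frame, threshold=128):
    """Treat each RAW pixel value directly as a grayscale luminance sample
    and binarize it, with no ISP optimization (no demosaic, white balance,
    or other correction) in between."""
    return [[1 if pixel >= threshold else 0 for pixel in row]
            for row in raw_frame]

# A tiny 2x3 "RAW frame" of 8-bit values:
raw = [[12, 200, 255],
       [250, 40, 90]]
assert raw_to_binary(raw) == [[0, 1, 1], [1, 0, 0]]
```

Black/white codes such as one-dimensional bar codes need only this kind of luminance contrast, which is why the ISP's color pipeline can be bypassed.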
Specifically, referring to the timing chart 400 of fig. 7, there are shown an externally triggered trigger signal 401, a fill-light timing 402 of the fill light 4, an image data acquisition timing 403 of the image sensor 3 continuously acquiring image data, and a decoding timing 404 of the decoding unit 8. The trigger signal 401 at a high level triggers the image sensor 3 to collect image data and the fill light 4 to illuminate, and at a low level triggers the image sensor 3 to stop collecting and the fill light 4 to turn off; the fill light 4 illuminates when the fill-light timing 402 is high and is off when it is low. The optical information collector 100 does not optimize the image data with the image signal processor 7, and only one previously collected frame remains in the storage area 13 of the image sensor 3, so the optical information collector 100 discards only the first frame. The dashed arrow in fig. 7 indicates that the second frame of image data is output to the decoding unit 8 for decoding: the decoding unit 8 receives and decodes the second frame at time point g, successfully decodes it at time point h, and feeds the success back to the central processor 5; owing to signal delay, the image sensor 3 stops collecting image data and the fill light 4 stops illuminating at time point i. It should be noted that, when the ambient light is sufficient, fill light is not necessary. As can be seen from the image data acquisition timing 403, six frames of image data have been acquired by the image sensor 3 at this point, and the sixth frame remains in the storage area 13 of the image sensor 3.
When the optical information collector 100 is triggered to collect new optical information again, it again discards the one frame of image data remaining in the image sensor 3 and starts decoding from the second frame, thereby avoiding decoding errors.
In the above methods, each time new optical information is collected, one or two frames of residual image data are discarded as required, which solves the problem of residual image data in the image sensor 3 or the image signal processor 7; more than two frames of residual image data may likewise be discarded as needed.
These methods nevertheless have a drawback: each time new optical information is acquired, the image sensor 3 must output, and the collector must discard, one or more frames of residual image data, and the decoding unit 8 starts decoding at the second frame at the earliest, which wastes time. Conceivably, efficiency could be improved if the first frame of image data output by the image sensor 3 on each new acquisition were already valid image data (image data of the new optical information).
It is conceivable that if the storage area 13 of the image sensor 3 or the register 14 of the image signal processor 7 is emptied after each successful decoding, no residual image data exists when new optical information is acquired next time; the first frame of image data is then the image data of the new optical information, decoding can start directly from the first frame, and the decoding speed is improved. This can be achieved by a preset algorithm, i.e. by algorithmic control, so that after each successful decoding the storage area 13 of the image sensor 3 or the register 14 of the image signal processor 7 is emptied. Such a preset clearing algorithm, however, generally has to be preset by the manufacturer of the image sensor 3 or the manufacturer of the image signal processor 7 (or the manufacturer of the central processor 5 in which the image signal processor 7 is integrated, the same applies hereinafter). For the manufacturer of the optical information collector 100, the arithmetic logic by which a purchased image sensor 3 or image signal processor 7 processes image data is usually predefined by its manufacturer and is not easy to change. That is, when the manufacturer has predefined that the last frame of image data remains stored in the image sensor 3 or the image signal processor 7 as long as it has not been decoded, the manufacturer of the optical information collector 100 can hardly change this behavior or directly eliminate the residual image data.
Moreover, since image sensors 3 produced by different manufacturers have different operation logics, and image signal processors 7 produced by different manufacturers likewise differ, even if the manufacturer of the optical information collector 100 could debug a given image sensor 3 or image signal processor 7 to eliminate the residual image data directly, the debugging would have to be repeated whenever the image sensor 3 or the image signal processor 7 is replaced, which is an enormous workload. A portable method that can empty the residual image data in image sensors 3 or image signal processors 7 of different models would therefore save considerable work.
In the block diagram of an alternative embodiment shown in fig. 8, the residual image data is eliminated by bypassing the image data processing flow predefined by the manufacturer of the image signal processor 7. The optical information collector 100 does not have the image signal processor 7 perform optimization processing on the image data; instead, the image data collected by the image sensor 3 is output to the image signal processor 7 through the existing MIPI interface and stored in a buffer 15 that can be configured separately by the manufacturer of the optical information collector 100. The buffer 15 is integrated in the image signal processor 7, although it may of course also be set up independently of the image signal processor 7, and the decoding unit 8 takes the image data out of the buffer 15 and decodes it. In this embodiment the image data collected by the image sensor 3 is still transmitted to the image signal processor 7 and then to the decoding unit 8 through the existing MIPI interface, because transmission over the existing MIPI interface is comparatively simple. In some embodiments, the image signal processor 7 may be bypassed completely, i.e. the image data acquired by the image sensor 3 is transmitted directly to the decoding unit 8 for decoding. Since the image data is not subjected to optimization processing by the image signal processor 7, only one frame of image data remains in the storage area 13 of the image sensor 3, and a specific flow can be set to eliminate this one frame of image data remaining in the image sensor 3.
In an alternative embodiment, after the original decoding process is finished, for example after decoding succeeds, the central processing unit 5 sends a finishing instruction to control the image sensor 3 to stop collecting image data; the central processing unit 5 then sends a further instruction to the image sensor 3, controlling it to collect one or more additional frames of image data, preferably one frame, and to output that frame, so that the image data in the storage area 13 of the image sensor 3 is cleared. The next time new optical information is collected, the first frame of image data output by the image sensor 3 is then the image data of the new optical information. The last frame of image data output by the image sensor 3 may be input into a buffer 15 configurable by the manufacturer of the optical information collector 100 and cleared there, finally eliminating the residual image data in the image sensor 3.
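The flush step described in this embodiment can be modeled as follows. This is a minimal sketch under the assumption that the sensor's storage area holds exactly one frame; the class and method names are illustrative, not from the patent.

```python
class SensorModel:
    """Minimal model of an image sensor with a one-frame storage area."""
    def __init__(self):
        self.storage = None        # last captured frame, not yet output
        self.frame_counter = 0

    def capture(self):
        self.frame_counter += 1
        self.storage = f"frame_{self.frame_counter}"

    def output(self):
        frame, self.storage = self.storage, None
        return frame

def flush_after_decode(sensor):
    """After a successful decode, capture one extra frame and output it
    at once, so that no residual frame remains in the storage area."""
    sensor.capture()
    return sensor.output()         # frame goes to a buffer and is dropped

sensor = SensorModel()
sensor.capture()                   # frame left over from the decode pass
flush_after_decode(sensor)         # extra capture-and-output empties storage
assert sensor.storage is None
```

On the next trigger, the first frame this model captures is new data, matching the behavior the embodiment aims for.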
Specifically, the timing diagram 500 of one embodiment in fig. 9 shows a trigger signal 501 of the central processor 5, a light-supplementing timing 502 of the light supplementing lamp 4, an image data acquisition timing 503 of the image sensor 3 continuously acquiring image data, and a decoding timing 504 of the decoding unit 8. The trigger signal 501 at a high level triggers the image sensor 3 to collect image data and the light supplementing lamp 4 to supplement light, and at a low level triggers the image sensor 3 to stop collecting image data and the light supplementing lamp 4 to stop supplementing light; the light supplementing lamp 4 supplements light at a high level of the light-supplementing timing 502 and turns off at a low level. The image data acquisition timing 503 of the image sensor 3 is synchronized with the light-supplementing timing 502: the image sensor 3 exposes at a high level of the image data acquisition timing 503 and outputs image data at a low level. In fig. 9 the dashed arrow represents the output of the first frame of image data to the decoding unit 8 for decoding: the decoding unit 8 receives the first frame of image data at time point j, decodes it successfully at time point k, and feeds the success back to the central processing unit 5, which sends a trigger signal at time point i to control the image sensor 3 to stop collecting image data and the light supplementing lamp 4 to stop supplementing light.
Unlike the foregoing embodiment, the central processor 5 then sends out the control signal 510 again and separately controls the image sensor 3 to collect one more frame of image data at the high level 530 and output it, so that no image data remains in the image sensor 3. Once the next trigger is received, the first frame of image data collected and output by the image sensor 3 is the image data of the new optical information, and the decoding unit 8 can directly receive and decode it. At this time the light supplementing lamp 4 is at the low level 520 and performs no light supplement, thereby saving power. It should be noted that, when the ambient light is sufficient, no light supplement is necessary in the whole process.
In the foregoing embodiments, image data is continuously acquired and decoded in the digital stream mode; by the time the decoding unit 8 has received and successfully decoded the first frame of image data, the image sensor 3 has already acquired multiple frames. In the timing chart 200, for example, the image sensor 3 has acquired seven frames in total, so the acquisition of the second to seventh frames is obviously a waste of power. Optical information collectors 100 currently produced by iData, Honeywell, Zebra and other companies can basically decode successfully within the first three frames of image data, that is, at least one of the first three frames collected by the image sensor 3 can be successfully decoded by the decoding unit 8. As described above, by the time the optical information collector 100 successfully decodes the third frame of image data, the image sensor 3 has already collected more than three frames, even six or seven. Collecting the fourth to seventh frames also requires the image sensor 3 to operate or the light supplementing lamp 4 to supplement light, and since those frames are not used for decoding, their acquisition wastes power. It should be noted that in some embodiments, when the ambient light is sufficient, light supplement is not necessary; for example, scanning a code with a mobile phone in daily life generally requires no light supplement.
In a preferred embodiment, the optical information collector 100 may collect image data in a fixed frame mode. Unlike the continuous collection of the digital stream mode, in the fixed frame mode the central processor 5 controls the image sensor 3 to collect a fixed number of frames each time, and the decoding unit 8 decodes that fixed number of frames. When decoding of the previously collected fixed number of frames is completed (one frame is decoded successfully, or all frames of the fixed number fail to decode), the central processor 5 determines whether a further fixed number of frames needs to be collected, and so on until decoding succeeds or times out. In the fixed frame mode there is a time interval between two successive acquisitions of the fixed number of frames, rather than continuous acquisition, which leaves time for the central processing unit 5 to make this judgment.
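The fixed frame mode loop can be sketched as follows. `acquire_batch` and `try_decode` are illustrative stand-ins for the image sensor 3 and the decoding unit 8; the batch size and batch limit are assumed values, not figures from the patent.

```python
def fixed_frame_scan(acquire_batch, try_decode, fixed_frames=3, max_batches=10):
    """Acquire `fixed_frames` images per batch; after each batch completes,
    decide whether another batch is needed. Returns the decoded result, or
    None when every batch fails (decoding timeout)."""
    for _ in range(max_batches):
        for frame in acquire_batch(fixed_frames):
            result = try_decode(frame)
            if result is not None:
                return result   # one frame of the batch decoded successfully
        # the whole batch failed: acquire another fixed-frame batch
    return None

# Example: decoding succeeds on the fourth frame overall, i.e. the first
# frame of the second fixed-frame batch.
frames = iter(range(1, 100))
acquire = lambda n: [next(frames) for _ in range(n)]
decode = lambda f: "DECODED" if f == 4 else None
assert fixed_frame_scan(acquire, decode) == "DECODED"
```

The sketch simplifies one detail: in the embodiment above, the sensor may still finish outputting the remainder of the current fixed-frame batch after a success, so that no frame remains in its storage area.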
Referring to the timing diagram 600 of an embodiment in fig. 10, there are shown a trigger signal 601 of the central processing unit 5, a light-supplementing timing 602 of the light supplementing lamp 4, an image data acquisition timing 603 of the image sensor 3, and a decoding timing 604 of the decoding unit 8. The trigger signal 601 at a high level triggers the image sensor 3 to collect image data and the light supplementing lamp 4 to supplement light, and at a low level triggers the image sensor 3 to stop collecting image data and the light supplementing lamp 4 to stop supplementing light; the light supplementing lamp 4 supplements light at a high level of the light-supplementing timing 602 and turns off at a low level. The image data acquisition timing 603 of the image sensor 3 is synchronized with the light-supplementing timing 602: the image sensor 3 exposes at a high level of the image data acquisition timing 603 and outputs image data at a low level. The four dashed arrows from left to right in fig. 10 represent the output of the first to fourth frames of image data, respectively, to the decoding unit 8 for decoding; none of the first to third frames is decoded successfully, and the fourth frame is decoded successfully. As can be seen from the image data acquisition timing 603, there is a distinct time interval between the acquisition of the first three frames of the fixed frame number and the acquisition of the following three frames, which allows the central processing unit 5 to judge whether the first three frames have been decoded, and hence whether the image sensor 3 needs to be controlled to acquire the next three frames of the fixed frame number.
The image data acquisition timing 603 shows the image sensor 3 acquiring image data in a fixed frame mode with a fixed frame number of three. The central processor 5 first controls the image sensor 3 to acquire three frames of image data and transmits them to the decoding unit 8; when none of the three frames is decoded successfully, it controls the image sensor 3 to acquire three more frames and transmits them to the decoding unit 8 again, and so on until decoding succeeds (or times out). As can be seen from the timing chart 600, when the fourth frame of image data is decoded successfully and the image sensor 3 has not yet acquired a full fixed frame number of three frames, the image sensor 3 continues in the fixed frame mode, acquires the remaining frames of the fixed frame number, that is, the fifth and sixth frames, outputs all of them, and only then stops acquisition, so that no image data remains in the image sensor 3. It is easy to understand that, by contrast, the image sensor 3 could instead be controlled to stop acquisition immediately after decoding succeeds, even if it has not acquired the full fixed frame number; this saves some power but leaves residual image data in the image sensor 3, which can be discarded the next time new optical information is acquired.
In the foregoing embodiment, because the image sensor 3 is controlled to collect the next three frames only after decoding of the first three frames (successful or not) is completed, there is a time interval between the two acquisitions; if the first three frames fail to decode, there is a noticeable delay before the next three frames of the fixed frame number are collected. As an improvement, the image sensor 3 may alternatively be controlled to acquire the next three frames already when the second frame of the first three fails to decode, or when the third frame is input to the decoding unit 8 for decoding, so as to balance decoding speed against power consumption; the moment at which acquisition of the next three frames starts can be chosen according to actual requirements, so that there is no obvious delay between the two acquisitions.
In the foregoing embodiment, the fixed frame number of the fixed frame mode is three, that is, the image sensor 3 collects three frames of image data at a time. In some embodiments, the fixed frame number may be determined according to the performance of the specific optical information collector 100: for example, if the optical information collector 100 can decode successfully within the first two frames or even the first frame, the fixed frame number may preferably be set to two frames or one frame, so as to avoid the power wasted on subsequently collected frames, with the image sensor 3 controlled to collect the next frame each time decoding of the current frame is completed without success; of course, the fixed frame number may also be set to two, four, five or more frames. In summary, combining the foregoing embodiments, most current optical information collectors 100 can decode successfully within the first three frames, and the total acquisition time of the fixed frame number should be less than or equal to the timeout for decoding one frame by the decoding unit 8. Under existing technical conditions the timeout is generally set to 100 ms, that is, when the decoding unit 8 has spent 100 ms on one frame without success, decoding of that frame is stopped and the next frame is decoded. The fixed frame number in the fixed frame mode is therefore preferably not more than five frames (20 ms x 5 = 100 ms), and further preferably three to five frames, so that the first batch of the fixed frame number can usually be decoded successfully without acquiring too many frames, which gives a power-consumption advantage over the existing digital stream mode. It is contemplated that, when a particular optical information collector 100 in digital stream mode needs more than five frames of image data to decode successfully, the fixed frame number may be set to five or more frames.
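The bound on the fixed frame number quoted above can be checked with a short calculation. The 20 ms frame period and 100 ms timeout are the values stated in the text; the helper function itself is illustrative.

```python
def max_fixed_frames(frame_period_ms, decode_timeout_ms):
    """Largest fixed frame number whose total acquisition time does not
    exceed the decoding timeout for a single frame."""
    return decode_timeout_ms // frame_period_ms

# 20 ms per frame, 100 ms decoding timeout: at most five frames per batch
# (20 ms x 5 = 100 ms), matching the preferred upper limit above.
assert max_fixed_frames(20, 100) == 5
```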
The method can adopt a hybrid mode combining the advantages of the fixed frame mode and the digital stream mode, which suits complex application scenarios and balances power consumption against decoding speed. For optical information that is difficult to identify, such as high-density two-dimensional codes, DPM (Direct Part Mark) codes or complex overlapping text, image data can first be acquired and decoded in the fixed frame mode, and when decoding is unsuccessful, acquisition continues in the digital stream mode for decoding; it is conceivable that this hybrid mode can also be used for reading simple optical information.
It is readily appreciated that the hybrid mode may be arranged in a variety of combinations.
For example, the camera 1 may be configured to collect image data in the fixed frame mode for a preset number of times and then collect image data in the digital stream mode; for example, the optical information collector 100 first collects a fixed frame number of image data in the fixed frame mode and then collects image data continuously in the digital stream mode. Referring to the timing chart 700 in fig. 11, there are shown a trigger signal 701 of the central processing unit 5, a light-supplementing timing 702 of the light supplementing lamp 4, an image data acquisition timing 703 of the image sensor 3, and a decoding timing 704 of the decoding unit 8: when decoding of the fixed frame number is unsuccessful, the optical information collector 100 continues to collect and decode image data in the digital stream mode, and the decoding unit 8 successfully decodes the first frame of image data collected in the digital stream mode.
In other embodiments, the fixed frame mode may be executed multiple times before switching to the digital stream mode when decoding remains unsuccessful. For example, the fixed frame mode may be executed twice and then the digital stream mode adopted: three frames of image data of the fixed frame number are first collected and decoded; when decoding is unsuccessful, three further frames of the fixed frame number are collected and decoded; and when decoding is still unsuccessful, the digital stream mode is adopted. It is conceivable that the fixed frame mode may also be executed three or more times before switching to the digital stream mode.
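The hybrid sequence of the embodiments above can be sketched as a two-phase loop: a preset number of fixed-frame batches, followed by continuous digital-stream acquisition. All names and the batch parameters are illustrative assumptions.

```python
def hybrid_scan(acquire_batch, stream_frames, try_decode,
                fixed_frames=3, fixed_attempts=2):
    """Try the fixed frame mode `fixed_attempts` times, then fall back to
    the continuous digital stream mode until a frame decodes."""
    for _ in range(fixed_attempts):          # phase 1: fixed frame mode
        for frame in acquire_batch(fixed_frames):
            result = try_decode(frame)
            if result is not None:
                return result
    for frame in stream_frames():            # phase 2: digital stream mode
        result = try_decode(frame)
        if result is not None:
            return result
    return None

# Example: two fixed-frame batches (frames 1-6) fail; the first frame of
# the digital stream (frame 7) decodes successfully.
source = iter(range(1, 50))
acquire = lambda n: [next(source) for _ in range(n)]
decode = lambda f: "DECODED" if f == 7 else None
assert hybrid_scan(acquire, lambda: source, decode) == "DECODED"
```

Setting `fixed_attempts` to one or to three covers the single-batch and three-batch variants described above.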
Since, as described above, image data remains in the image sensor 3 when decoding succeeds in the digital stream mode, the hybrid mode may, to solve this problem, use the fixed frame mode first, then the digital stream mode, and finally end with the fixed frame mode.
Specifically, the timing chart 800 of one embodiment in fig. 12 shows a trigger signal 801 of the central processing unit 5, a light-supplementing timing 802 of the light supplementing lamp 4, an image data acquisition timing 803 of the image sensor 3, and a decoding timing 804 of the decoding unit 8. The optical information collector 100 first acquires three frames of image data in a fixed frame mode with a fixed frame number of three; when decoding is unsuccessful, it continues to acquire and decode image data in the digital stream mode, and the decoding unit 8 successfully decodes the first frame of image data acquired in the digital stream mode, i.e. the fourth frame overall, whereupon the image sensor 3 is controlled to stop acquisition and the light supplementing lamp 4 to stop supplementing light. Unlike the previous embodiment, the central processing unit 5 then sends out the control signal 810 again and independently controls the image sensor 3 to collect one more frame of image data at the high level 830 and output it, so that no image data remains in the image sensor 3; furthermore, since the image signal processor 7 is bypassed, no image data remains in the image signal processor 7 either. The next time new optical information is collected after triggering, the first frame of image data collected and output by the image sensor 3 is the image data of the new optical information, and the decoding unit 8 can directly receive and decode it. At this time the light supplementing lamp 4 is at the low level 820 and performs no light supplement, thereby saving power. It should be noted that, when the ambient light is sufficient, no light supplement is necessary in the whole process.
It is conceivable that the hybrid mode may also first collect and decode image data in the digital stream mode and, after decoding succeeds, adopt the fixed frame mode, controlling the image sensor 3 to continue collecting a fixed frame number of image data and to output all of it, so that no image data remains in the image sensor 3; the foregoing embodiment, in which the image sensor 3 continues to acquire one frame of image data after decoding succeeds in the digital stream mode, is a special case of this.
It is conceivable that, when the optical information collector 100 collects image data in the hybrid mode, it may have the image signal processor 7 perform optimization processing on the image data; to eliminate residual image data in the image signal processor 7, the foregoing method of discarding residual image data of a specific frame number N may then be adopted. The specific number N of discarded frames may be determined according to where the residual image data lies: for example, when residual image data is stored in both the image sensor 3 and the image signal processor 7, two frames of image data need to be discarded each time image data is re-collected; when no image data remains in the image sensor 3 and one frame remains in the image signal processor 7, only that one frame needs to be discarded. Alternatively, the optical information collector 100 may not process the image data with the image signal processor 7; when one frame of residual image data lies in the image sensor 3, only that frame needs to be discarded each time new image data is collected. In the hybrid mode, when the fixed frame mode is adopted and the image sensor 3 holds no residual image data, no residual image data needs to be discarded when image data is collected again.
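The rule for choosing N described above reduces to counting the components that still hold a stale frame. The mapping below is an illustrative sketch of that rule, assuming each component retains at most one residual frame.

```python
def residual_frame_count(sensor_has_residual, isp_has_residual):
    """N = number of frames to discard on the next trigger: one for a frame
    left in the image sensor, one for a frame left in the image signal
    processor (each assumed to retain at most one frame)."""
    return int(sensor_has_residual) + int(isp_has_residual)

assert residual_frame_count(True, True) == 2    # both hold a stale frame
assert residual_frame_count(False, True) == 1   # only the ISP holds one
assert residual_frame_count(False, False) == 0  # fixed frame mode, no residue
```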
The optical information collector and the method thereof have the following beneficial effects:
1. When the image sensor 3 acquires image data through triggering, the central processing unit 5 issues an instruction to discard N frames of image data of a specific frame number, the N frames being the image data acquired and remaining from the previous trigger, so that residual image data is prevented from being decoded and output, and decoding errors are avoided.
2. In the fixed frame mode, the image sensor 3 collects and outputs a fixed number of frames of image data each time. Compared with the existing method of continuously collecting and outputting image data in the digital stream mode, this saves power: it avoids the waste incurred when, upon successful decoding, the frames still being collected continuously are never used for decoding.
3. The image sensor 3 collects image data in the digital stream mode without the image signal processor 7 optimizing the image data, so that no residual image data arises in the image signal processor 7; and when decoding succeeds or times out, the image sensor 3 is controlled to stop continuous collection in the digital stream mode and instead to collect and output a fixed number of frames, so that no residual image data remains in the image sensor 3, decoding errors are avoided the next time optical information is collected, and efficiency is improved.
The above detailed description illustrates preferred embodiments of the present application and is not intended to limit its scope; all technical and scientific variations that employ the present invention are intended to fall within the scope of the present invention.