Drawings
FIG. 1 is a schematic diagram of an electronic device according to an embodiment of the invention;
FIG. 2 is a flowchart of a fingerprint registration method in accordance with an embodiment of the present invention;
FIG. 3 is a flowchart of a fingerprint registration method according to a first embodiment of the present invention;
FIG. 4 is a schematic diagram of a pre-registration data set in accordance with a first embodiment of the present invention;
FIGS. 5A-5E are schematic diagrams of finger actions and corresponding user interface displays according to a first embodiment of the present invention;
FIG. 6 is a flow chart of a fingerprint enrollment method according to a second embodiment of the present invention;
FIG. 7 is a schematic diagram of a pre-registration data set in accordance with a second embodiment of the present invention;
FIGS. 8A-8G are schematic diagrams of finger movements and corresponding user interface displays according to a second embodiment of the present invention;
FIG. 9 is a flowchart of a fingerprint registration method according to a third embodiment of the present invention;
FIG. 10 is a schematic diagram of a pre-registration data set in accordance with a third embodiment of the present invention;
FIGS. 11A-11I are schematic diagrams of finger actions and corresponding user interface displays according to a third embodiment of the invention.
The reference numbers illustrate:
100: an electronic device;
110: a processor;
120: a fingerprint sensor;
130: a memory;
140: a display;
410, 420, 400, 710, 720, 730, 1010, 1020, 1030: an image;
411, 711, 1011: feature points;
500, 800, 1100: a user interface;
510, 810, 1110: a reference image;
511, 811, 811b, 1111: a completion area;
DW1, DW2: a width;
F: a finger;
h, H, W: image parameters;
Step: an image adjustment parameter;
S210, S220, S225, S230, S232, S235, S240, S245, S250, S310, S320, S331, S332, S333, S334, S340, S341, S350, S380, S610, S620, S631, S632, S633, S634, S640, S641, S650, S680, S910, S920, S922, S924, S925, S926, S940, S980: steps;
(X1, Y1), (X2, Y2), (Xn, Yn): coordinate parameters;
(Δx, Δy): displacement parameters.
Detailed Description
In order that the present disclosure may be more readily understood, the following specific examples are given as illustrative of the invention which may be practiced in various ways. Further, wherever possible, the same reference numbers will be used throughout the drawings and the description to refer to the same or like parts.
FIG. 1 is a schematic diagram of an electronic device according to an embodiment of the invention. Referring to FIG. 1, the electronic device 100 includes a processor 110, a fingerprint sensor 120, a memory 130, and a display 140. The processor 110 is coupled to the fingerprint sensor 120, the memory 130, and the display 140. The electronic device 100 may be an electronic product such as a smart phone, a notebook computer (NB), a tablet PC, and the like. In the present embodiment, the electronic device 100 performs a fingerprint sensing operation through the fingerprint sensor 120 to acquire a fingerprint image of a user. In the present embodiment, when a user places a finger on the fingerprint sensor 120 and performs a swiping operation, the fingerprint sensor 120 performs fingerprint sensing. The fingerprint sensor 120 continuously acquires a plurality of swipe images and provides them to the processor 110. The processor 110 analyzes the swipe images to extract a plurality of feature points from each swipe image, wherein the feature points are fingerprint feature points of the finger. The processor 110 then generates fingerprint registration data based on the feature point data.
In the present embodiment, the fingerprint sensor 120 obtains the swipe images one by one, and while the processor 110 analyzes the swipe images one by one, the processor 110 correspondingly changes a completion area of the fingerprint reference image included in a user interface (UI) displayed on the display 140 according to the analysis result of each swipe image. In this embodiment, the reference image in the user interface includes a completion area. The completion area of the reference image represents the coverage of the acquired fingerprint information, and its range gradually increases in correspondence with the current progress of the user's swipe (i.e., the progress of acquiring the fingerprint information). Therefore, the fingerprint registration function of the electronic device 100 of the present invention provides a good interaction effect, so that the user can know the current registration progress.
In the present embodiment, the processor 110 is, for example, a central processing unit (CPU), a system on chip (SOC), another programmable general-purpose or special-purpose microprocessor, a digital signal processor (DSP), a programmable controller, an application-specific integrated circuit (ASIC), a programmable logic device (PLD), another similar processing device, or a combination thereof.
In the present embodiment, the fingerprint sensor 120 is, for example, a capacitive fingerprint sensor or an optical fingerprint sensor, and the present invention is not limited to the type of the fingerprint sensor 120. In the present embodiment, the fingerprint sensing mechanism of the fingerprint sensor 120 may be sliding (swiping) sensing or pressing sensing. It is noted that the fingerprint registration according to the embodiments of the present invention is performed by sliding-type sensing. That is, during the fingerprint registration process, the user slides a finger on the sensing surface of the fingerprint sensor 120, and the fingerprint sensor 120 senses and obtains the fingerprint information of the user through the sensing surface. For example, the electronic device 100 may be designed such that the user registers a fingerprint by sliding a finger, in which case the fingerprint sensor 120 performs sliding sensing, and authenticates a fingerprint by pressing a finger, in which case the fingerprint sensor 120 performs pressing sensing.
In the present embodiment, the memory 130 is used to store the fingerprint data according to the embodiments of the present invention, and to store the related application programs for the processor 110 to read and execute.
In the present embodiment, the display 140 is, for example, a liquid crystal display (LCD), a light emitting diode (LED) display, a micro LED display, an organic LED display, or the like, and the type of the display 140 is not limited in the present invention. In the present embodiment, when a user performs fingerprint registration, the display 140 displays a corresponding user interface, and a reference image simulating a fingerprint is included in the user interface. While the finger slides on the fingerprint sensor 120, the range of the completion area of the reference image displayed on the display 140 correspondingly increases as the fingerprint data sensed by the fingerprint sensor 120 increases.
FIG. 2 is a flowchart of a fingerprint registration method according to an embodiment of the present invention. Referring to FIG. 1 and FIG. 2, the fingerprint registration method of the present embodiment can be applied to the electronic device 100 of the embodiment of FIG. 1. When the user registers a fingerprint, the electronic device 100 performs a swipe fingerprint sensing operation through the fingerprint sensor 120 to obtain swipe images of an object (i.e., the user's finger) one by one. In step S210, the fingerprint sensor 120 acquires a swipe image. In step S220, the processor 110 analyzes the swipe image to obtain a plurality of feature points of the swipe image. In step S225, the processor 110 determines whether the swipe image is the first swipe image. If yes, step S230 is executed. In step S230, the processor 110 generates a pre-registration data set according to the feature points of the swipe image, and analyzes the pre-registration data set to obtain its base image parameter (h). In step S232, the processor 110 displays the completion area of the reference image on the user interface according to the base image parameter (h). If not, step S235 is executed. In step S235, the processor 110 merges the feature points of the swipe image into the pre-registration data set to generate a merged pre-registration data set. In step S240, the processor 110 analyzes the merged pre-registration data set to obtain a merged image parameter (H), and obtains an image adjustment parameter (Step) according to the merged image parameter (H) and the base image parameter (h), wherein the image adjustment parameter is equal to the merged image parameter minus the base image parameter, i.e., Step = H - h. In step S245, the processor 110 uses the merged image parameter (H) as the new base image parameter (h). In step S250, the processor 110 increases the range of the completion area of the reference image in the user interface according to the image adjustment parameter (Step), i.e., increases the length of the completion area. In order to further understand the display manner of the user interface and the technical details of fingerprint registration, a plurality of embodiments are provided below for detailed description.
FIG. 3 is a flowchart of a fingerprint registration method according to a first embodiment of the present invention. Referring to FIG. 1 and FIG. 3, the fingerprint registration method of the present embodiment can be applied to the electronic device 100 of the embodiment of FIG. 1. When the user performs fingerprint registration, in step S310, the electronic device 100 senses an object (i.e., the finger of the user) through the fingerprint sensor 120 to obtain a swipe image. In step S320, the processor 110 analyzes the swipe image to obtain a plurality of feature points of the swipe image. In step S331, the processor 110 determines whether the swipe image is the first swipe image. If yes, the processor 110 executes step S332. In step S332, the processor 110 generates a pre-registration data set based on the feature points of the swipe image, and obtains the base image parameter (h) of the pre-registration data set. In step S333, the processor 110 displays the completion area of the reference image according to the base image parameter (h). If not, the processor 110 executes step S334. In step S334, the processor 110 merges the feature points of the swipe image into the pre-registration data set to generate a merged pre-registration data set.
In step S340, the processor 110 analyzes the merged pre-registration data set to obtain a merged image parameter (H), and obtains an image adjustment parameter (Step) according to the merged image parameter (H) and the base image parameter (h), wherein the image adjustment parameter is equal to the merged image parameter minus the base image parameter, i.e., Step = H - h. In step S341, the processor 110 uses the merged image parameter (H) as the new base image parameter (h). In step S350, the processor 110 increases the range of the completion area of the reference image, i.e., increases the length of the completion area, according to the image adjustment parameter (Step). In step S380, the processor 110 determines whether the merged image parameter (H) is greater than a preset threshold. If so, sufficient fingerprint registration data has been obtained, and the processor 110 terminates the fingerprint sensing operation of the fingerprint sensor 120 and stores the pre-registration data set in the memory 130 as a fingerprint registration data set to complete the fingerprint registration process. If not, the processor 110 executes step S310 to obtain the next swipe image.
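The flow of FIG. 3 can be read as a simple incremental loop: merge feature points, recompute the image parameter, grow the completion area by the difference, and stop once a threshold is passed. The following Python sketch illustrates this reading under stated assumptions; the callables for acquiring images, extracting feature points, and updating the user interface are hypothetical placeholders rather than an actual device API, and the image parameter is taken here as the span of the merged feature points in the length direction.

```python
from typing import Callable, Iterable, List, Tuple

Point = Tuple[float, float]  # (x, y) coordinates of a fingerprint feature point


def merged_length(points: Iterable[Point]) -> float:
    """Span of the merged feature points in the length direction -- one
    possible reading of the base/merged image parameter h or H."""
    ys = [y for _, y in points]
    return max(ys) - min(ys) if ys else 0.0


def enroll_fingerprint(acquire_swipe_image: Callable[[], object],
                       extract_feature_points: Callable[[object], List[Point]],
                       update_completion_area: Callable[[float], None],
                       threshold: float) -> List[Point]:
    """Sketch of the FIG. 3 loop (steps S310-S380); the callables stand in
    for the fingerprint sensor, the feature extractor, and the UI."""
    pre_registration: List[Point] = []   # merged feature points
    h = None                             # base image parameter

    while True:
        image = acquire_swipe_image()                 # step S310
        points = extract_feature_points(image)        # step S320
        pre_registration.extend(points)               # step S332 / S334

        if h is None:                                 # step S331: first swipe image
            h = merged_length(pre_registration)
            update_completion_area(h)                 # step S333: initial length
        else:
            H = merged_length(pre_registration)       # step S340
            step = H - h                              # image adjustment parameter
            h = H                                     # step S341
            update_completion_area(step)              # step S350: grow the length

        if h > threshold:                             # step S380
            return pre_registration                   # stored as registration data
```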
FIG. 4 is a schematic diagram of a pre-registration data set according to the first embodiment of the invention. FIGS. 5A-5E are schematic diagrams of finger swipe motions and corresponding user interface displays in accordance with the first embodiment of the present invention. Referring to FIG. 1, FIG. 4, and FIGS. 5A to 5E, the flow of FIG. 3 may also be applied to the present embodiment. In the embodiment, after the fingerprint sensor 120 obtains the first swipe image 410 of the finger F, the processor 110 analyzes the swipe image 410 to obtain a plurality of feature points 411 and a base image parameter h of the swipe image 410. It is noted that the area of the swipe image 410 may be equal to the area of the sensing surface of the fingerprint sensor 120. In the present embodiment, the initial value of the base image parameter h may be the distance between the two feature points 411 of the first swipe image 410 that are farthest apart in the length direction, but the present invention is not limited thereto. In another embodiment, the initial value of the base image parameter h may also be the length of the swipe image 410, i.e., the length of the sensing surface of the fingerprint sensor 120. The processor 110 generates a pre-registration data set according to the feature points 411 of the swipe image 410. Then, the processor 110 continues to obtain the next swipe image 420 and obtains the feature points of the swipe image 420. In the present embodiment, the processor 110 merges the feature points of the swipe image 420 into the pre-registration data set (i.e., merges the feature points of the swipe images 410 and 420) to generate a new pre-registration data set 400. The processor 110 calculates the merged image parameter H of the merged pre-registration data set 400. Similarly, the merged image parameter H may be the distance between the two merged feature points that are farthest apart in the length direction, or may be the sum of the lengths of the two swipe images 410 and 420 (i.e., twice the length of the sensing surface of the fingerprint sensor 120) minus the length of their overlapping portion. Next, the processor 110 subtracts the base image parameter h from the merged image parameter H to obtain the image adjustment parameter Step (= H - h).
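As a small numerical illustration of the second definition of the merged image parameter H described above (the first definition, the span of the merged feature points, matches the merged_length helper in the earlier sketch), the following computes H from the image lengths and their overlap; the millimeter values are invented purely for illustration.

```python
def merged_param_from_overlap(sensor_length: float,
                              overlap_length: float,
                              num_images: int = 2) -> float:
    """Alternative definition of H: total length of the merged swipe images
    minus the length of their overlapping portion."""
    return num_images * sensor_length - overlap_length


# Hypothetical numbers: with a 4.0 mm sensing surface and a 2.5 mm overlap
# between swipe images 410 and 420, H = 2 * 4.0 - 2.5 = 5.5 mm, so the
# image adjustment parameter is Step = H - h = 5.5 - 4.0 = 1.5 mm.
h = 4.0
H = merged_param_from_overlap(sensor_length=4.0, overlap_length=2.5)
step = H - h
print(f"H = {H}, Step = {step}")
```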
That is, each time the processor 110 incorporates the feature points of a swipe image, the processor 110 calculates the new length of the merged pre-registration data set 400 to adjust the length of the completion area 511 of the reference image 510 in the user interface 500. It is noted that the width DW of the completion area 511 of the reference image 510 is preset and fixed. Each time feature point data of a swipe image is added, the processor 110 correspondingly increases the length of the completion area 511. In addition, the processor 110 determines whether the merged image parameter H is greater than a predetermined threshold. If so, sufficient fingerprint registration data has been obtained. For example, when the merged image parameter H is greater than the preset threshold, it indicates that a sufficient number of fingerprint feature points have been acquired; it may also indicate that a sufficient number of swipe images have been taken. Therefore, the processor 110 stores the pre-registration data set in the memory 130 as a fingerprint registration data set to complete the fingerprint registration process.
Taking FIGS. 5A to 5E as an example, as shown in FIG. 5A, when the user places a finger F on the fingerprint sensor 120 of the electronic device 100 and performs a swipe action, the electronic device 100 displays a reference image 510 on the user interface 500 and acquires swipe images one by one. When the first swipe image is obtained, the electronic device 100 displays a corresponding completion area 511 in the reference image 510, wherein the length of the completion area 511 corresponds to the base image parameter h of the first swipe image. Further, as described above, the width of the completion area 511 is preset and fixed, and as shown in the drawing, the width of the completion area 511 may be equal to or greater than the width of the reference image 510. As shown in FIG. 5B, when the second swipe image is obtained, the electronic device 100 increases the length of the completion area 511 of the reference image 510 according to the image adjustment parameter Step. As shown, the width of the completion area 511 remains fixed. As shown in FIG. 5C, when the user's finger F leaves the fingerprint sensor 120, the length of the completion area 511 of the reference image 510 stops increasing. However, since sufficient fingerprint information has not yet been obtained, i.e., the merged image parameter H is not greater than the predetermined threshold, the fingerprint registration procedure is not yet complete, so the user interface 500 continues to display the completion area 511 of the reference image 510 and prompts the user to swipe the finger again. Next, as shown in FIG. 5D, the finger F of the user is placed on the fingerprint sensor 120 of the electronic device 100 again and performs a swiping motion. The electronic device 100 obtains a new swipe image and continues to increase the length of the completion area 511 of the reference image 510 according to the new swipe image (i.e., the new fingerprint feature point data). Finally, as shown in FIG. 5E, when the merged image parameter H is greater than the predetermined threshold, the processor 110 makes the completion area 511 completely cover the reference image 510, that is, the length of the completion area 511 is greater than or equal to the length of the reference image 510. The processor 110 then stops the fingerprint sensing operation of the fingerprint sensor 120 and generates a fingerprint registration data set from the pre-registration data set to complete the fingerprint registration process.
FIG. 6 is a flowchart of a fingerprint registration method according to a second embodiment of the present invention. Referring to FIG. 1 and FIG. 6, the fingerprint registration method of the present embodiment can be applied to the electronic device 100 of the embodiment of FIG. 1. When the user performs fingerprint registration, in step S610, the electronic device 100 senses an object (i.e., the finger of the user) through the fingerprint sensor 120 to obtain a swipe image. In step S620, the processor 110 analyzes the swipe image to obtain a plurality of feature points of the swipe image. In step S631, the processor 110 determines whether the swipe image is the first swipe image. If yes, the processor 110 executes step S632. In step S632, the processor 110 obtains the coordinate parameters (X, Y) of the feature point located at the top-left corner of the swipe image, generates a pre-registration data set according to the feature points of the swipe image, and obtains the base image parameter (h) of the pre-registration data set. In step S633, the processor 110 displays the completion area of the reference image on the display 140 according to the base image parameter (h) and the coordinate parameters (X, Y). If not, the processor 110 executes step S634. In step S634, the processor 110 merges the feature points of the swipe image into the pre-registration data set to generate a merged pre-registration data set.
In step S640, the processor 110 analyzes the merged pre-registration data set to obtain a first merged image parameter (H) and a second merged image parameter (W), and obtains an image adjustment parameter (Step = H - h) according to the first merged image parameter (H) and the base image parameter (h). The first merged image parameter (H) may be the distance between the two feature points of the merged pre-registration data set that are farthest apart in the length direction, or may be the total length of the merged swipe images minus the length of their overlapping portion. The second merged image parameter (W) is the distance between the two merged feature points that are farthest apart in the width direction. In step S641, the processor 110 uses the first merged image parameter (H) as the new base image parameter (h). In step S650, the processor 110 increases the range of the completion area of the reference image according to the image adjustment parameter (Step). In step S680, the processor 110 determines whether the first merged image parameter (H) is greater than a first preset threshold and whether the second merged image parameter (W) is greater than a second preset threshold. If so, the processor 110 ends the fingerprint sensing operation of the fingerprint sensor 120 and generates fingerprint registration data according to the merged pre-registration data set to complete the fingerprint registration procedure. If not, the processor 110 executes step S610 to obtain the next swipe image.
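The second embodiment differs from the first mainly in that registration ends only when both a length-direction parameter and a width-direction parameter exceed their thresholds. The sketch below expresses this check in Python under the assumption that the y axis is the length direction and the x axis is the width direction; the function names are illustrative, not the device's actual API.

```python
from typing import Iterable, List, Tuple

Point = Tuple[float, float]  # (x, y) coordinates of a fingerprint feature point


def merged_image_params(points: Iterable[Point]) -> Tuple[float, float]:
    """Length-direction span H and width-direction span W of the merged
    feature points (step S640 of FIG. 6)."""
    pts: List[Point] = list(points)
    xs = [x for x, _ in pts]
    ys = [y for _, y in pts]
    H = max(ys) - min(ys)   # first merged image parameter (length)
    W = max(xs) - min(xs)   # second merged image parameter (width)
    return H, W


def registration_complete(points: Iterable[Point],
                          length_threshold: float,
                          width_threshold: float) -> bool:
    """Step S680: registration ends only when both spans exceed their
    respective preset thresholds."""
    H, W = merged_image_params(points)
    return H > length_threshold and W > width_threshold
```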
FIG. 7 is a schematic diagram of a pre-registration data set according to the second embodiment of the invention. FIGS. 8A-8G are schematic diagrams of finger actions and corresponding user interface displays according to the second embodiment of the present invention. Referring to FIG. 1, FIG. 7, and FIGS. 8A to 8G, the flow of FIG. 6 may also be applied to the present embodiment. In this embodiment, after the fingerprint sensor 120 obtains the first swipe image 710 of the finger F, the processor 110 analyzes the swipe image 710 to obtain a plurality of feature points 711 of the swipe image 710, and obtains the base image parameter h and the coordinate parameters (X1, Y1) of the feature point located at the top-left corner of the swipe image 710. The processor 110 displays the completion area 811 of the reference image 810 according to the coordinate parameters (X1, Y1) and the base image parameter h. It is noted that the area of the swipe image 710 will be equal to the area of the sensing surface of the fingerprint sensor 120. In the present embodiment, the base image parameter h may be the distance between the two feature points 711 of the swipe image 710 that are farthest apart in the length direction, but the present invention is not limited thereto. In another embodiment, the base image parameter h may also be the length of the swipe image 710, i.e., the length of the sensing surface of the fingerprint sensor 120. The processor 110 generates a pre-registration data set according to the feature points 711 of the swipe image 710. Then, the processor 110 continues to obtain the next swipe image 720 and obtains the feature points of the swipe image 720. In the present embodiment, the processor 110 merges the feature points of the swipe image 720 into the pre-registration data set (i.e., merges the feature points of the swipe images 710 and 720) to generate the merged pre-registration data set 700. The processor 110 calculates the first merged image parameter H (i.e., the maximum image length) and the second merged image parameter W (i.e., the maximum image width) of the merged pre-registration data set 700. The processor 110 subtracts the base image parameter h from the first merged image parameter H to obtain the image adjustment parameter Step. Then, the processor 110 increases the length of the completion area 811 of the reference image 810 according to the image adjustment parameter Step.
That is, after the processor 110 incorporates the feature points of a swipe image, the processor 110 calculates the new length of the merged pre-registration data set 700 to adjust the length of the completion area 811 of the reference image 810 in the user interface 800. It is noted that the width of the completion area 811 of the reference image 810 is preset and fixed for each finger swipe motion; that is, the width of the completion area 811 of the reference image 810 is increased only once per finger swipe. During each finger swiping motion, every time feature point data of a swipe image is added, the processor 110 correspondingly increases the length of the completion area 811. In addition, as the processor 110 merges the swipe images into the pre-registration data set 700, the processor 110 determines whether the first merged image parameter H and the second merged image parameter W are greater than the first and second predetermined thresholds, respectively. If so, sufficient fingerprint registration data has been obtained. For example, when the first merged image parameter H is greater than the first preset threshold and the second merged image parameter W is greater than the second preset threshold, it indicates that a sufficient number of fingerprint feature points have been obtained; it may also indicate that a sufficient number of swipe images have been taken. The processor 110 then stores the pre-registration data set in the memory 130 as a fingerprint registration data set to complete the fingerprint registration process. If not, the processor 110 may display a prompt on the user interface via the display 140 to ask the user to swipe the finger again. During the second swipe, the processor 110 obtains the first swipe image 730 of the second swipe through the fingerprint sensor 120, and the processor 110 extracts the feature points 711 of the swipe image 730 and incorporates them into the pre-registration data set 700. The processor 110 obtains the displacement parameters (Δx, Δy) according to the coordinate parameters (X1, Y1) of the feature point located at the top-left corner of the swipe image 710 (i.e., the first swipe image obtained during the first swipe) and the coordinate parameters (X2, Y2) of the feature point located at the top-left corner of the first swipe image 730 obtained during the second swipe, where Δx = X2 - X1 and Δy = Y2 - Y1. The processor 110 determines the position and the width-direction increase of the completion area 811 of the reference image 810 of the user interface 800 corresponding to the second swiping motion according to the displacement parameters (Δx, Δy).
In other words, during the second swipe, the user's finger F is shifted to the right or left, and the processor 110 determines the change in the width range of the completion area 811 corresponding to the second swipe through the coordinate parameters (X2, Y2) of the top-left feature point in the first swipe image 730 obtained during the second swipe, i.e., through the displacement parameters (Δx, Δy), and determines the length of the new portion 811b of the completion area 811 corresponding to the second swipe according to the base image parameter h of the first swipe image 730. The processor 110 displays the start position of the portion 811b of the completion area 811 corresponding to the second swipe according to the coordinate parameters (X2, Y2). That is, at the time of the second swipe, the range of the completion area 811 in the width direction increases according to the degree of displacement of the user's finger F. Then, the processor 110 increases the length of the portion 811b of the completion area 811 corresponding to the second swipe according to the subsequently obtained swipe images and the corresponding image adjustment parameter Step. Each time a swipe image is obtained, the processor 110 determines whether the first merged image parameter H and the second merged image parameter W are greater than the first and second predetermined thresholds, respectively, to determine whether to end the fingerprint registration process.
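A minimal sketch of how the displacement parameters could be derived from the two swipes, assuming the "top-left" feature point is simply the one with the smallest x (and then smallest y) coordinate; the helper names and the final offsetting of the completion area are illustrative assumptions, not the device's actual implementation.

```python
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) coordinates of a fingerprint feature point


def top_left_feature(points: List[Point]) -> Point:
    """Pick the 'top-left' feature point of the first image of a swipe
    (assumed here to mean smallest x, then smallest y)."""
    return min(points, key=lambda p: (p[0], p[1]))


def displacement(first_swipe_points: List[Point],
                 second_swipe_points: List[Point]) -> Tuple[float, float]:
    """Displacement parameters between the first images of the two swipes:
    Δx = X2 - X1, Δy = Y2 - Y1."""
    x1, y1 = top_left_feature(first_swipe_points)
    x2, y2 = top_left_feature(second_swipe_points)
    return x2 - x1, y2 - y1


# The user interface can then offset the start position of the new portion
# 811b of the completion area by (dx, dy), while the length of 811b keeps
# growing by Step = H - h as further swipe images arrive.
```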
Taking FIGS. 8A to 8G as an example, as shown in FIG. 8A, when the user places the finger F on the fingerprint sensor 120 of the electronic device 100 and performs a first swipe action, a corresponding completion area 811 is displayed on the reference image 810 of the user interface 800 to correspond to the first swipe image taken by the fingerprint sensor 120. As shown in FIGS. 8B and 8C, during the first swipe of the finger F, the length range of the completion area 811 of the reference image 810 is correspondingly adjusted. In addition, during the first swipe of the finger F, the processor 110 increases the length range of the completion area 811 of the reference image 810 according to the image adjustment parameter Step while keeping the image width fixed. That is, during the first swiping motion of the finger F, the width DW1 of the completion area 811 of the reference image 810 is fixed, and its length is increased after each new swipe image is obtained. As shown in FIG. 8D, when the user's finger F leaves the fingerprint sensor 120, the range of the completion area 811 of the reference image 810 stops increasing. However, since the fingerprint registration is still incomplete, the user interface 800 continues to display the reference image 810 with the existing completion area 811, and prompts the user to swipe the finger again. Therefore, as shown in FIGS. 8E and 8F, the finger of the user is placed on the fingerprint sensor 120 of the electronic device 100 again and performs the second swipe motion. Compared with the finger placement position during the first swipe, the finger position during the second swipe is shifted toward the upper right by a distance. After the first image of the second swipe motion is acquired, the processor 110 calculates the coordinate parameters (X2, Y2) of its top-left feature point, subtracts from them the coordinate parameters (X1, Y1) of the top-left feature point of the first swipe image obtained during the first swipe motion to obtain the displacement parameters (Δx, Δy) (i.e., Δx = X2 - X1, Δy = Y2 - Y1), and determines, based on (Δx, Δy), the display position of the first image of the second swipe, that is, the start position of the new portion 811b of the completion area 811 corresponding to the second swipe. As shown, during the second swipe of the finger, the range of the completion area 811 of the reference image 810 continues to increase (i.e., portion 811b), wherein the processor 110 determines the start position of the newly added portion 811b of the completion area 811 according to the displacement parameters (Δx, Δy), and increases the length range of the portion 811b of the completion area 811 of the reference image 810 corresponding to the second swipe according to the image adjustment parameter Step while keeping the image width fixed. That is, during the second swiping motion of the finger F, the width DW2 of the newly added portion 811b of the completion area 811 of the reference image 810 is fixed, and its length is increased after each new swipe image is obtained. Finally, as shown in FIG. 8G, when the first merged image parameter H and the second merged image parameter W of the merged pre-registration data set 700 are respectively greater than the first and second predetermined thresholds, the range of the completion area 811 of the reference image 810 has increased to a sufficient length and width, so the processor 110 stops the fingerprint sensing operation of the fingerprint sensor 120 to complete the fingerprint enrollment process.
FIG. 9 is a flowchart of a fingerprint registration method according to a third embodiment of the present invention. Referring to FIG. 1 and FIG. 9, the fingerprint registration method of the present embodiment can be applied to the electronic device 100 of the embodiment of FIG. 1. When the user performs fingerprint registration, in step S910, the electronic device 100 senses an object (i.e., the finger of the user) by the fingerprint sensor 120 to obtain a swipe image. In step S920, the processor 110 analyzes the swipe image to obtain a plurality of feature points of the swipe image, and obtains the coordinate parameters (X, Y) of the feature point located at the top-left corner of the swipe image. In step S922, the processor 110 determines whether the swipe image is the first swipe image. If yes, step S924 is executed. In step S924, the processor 110 generates a pre-registration data set according to the feature points of the swipe image. Next, in step S925, the processor 110 displays a corresponding completion area on the reference image of the user interface according to the coordinate parameters (X, Y) and the area of the swipe image. Notably, the area of the swipe image is equivalent to the area of the sensing surface of the fingerprint sensor 120. If not, step S926 is executed. In step S926, the processor 110 incorporates the feature points of the swipe image into the pre-registration data set. In step S940, the processor 110 increases the range of the completion area according to the coordinate parameters (X, Y) and the area of the swipe image. In step S980, the processor 110 determines whether the total area of the pre-registration data set is greater than a preset threshold. The total area of the pre-registration data set may represent the sum of the areas of all the swipe images minus the area in which the swipe images overlap, or may represent the number of feature points included in the pre-registration data set. In other words, in the embodiment, in step S980, the processor 110 may determine whether the number of feature points included in the pre-registration data set is greater than a preset threshold. If so, the processor 110 ends the fingerprint sensing operation of the fingerprint sensor 120 and generates fingerprint registration data according to the merged pre-registration data set to complete the fingerprint registration procedure. If not, the processor 110 executes step S910 to sense and obtain the next swipe image.
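The third embodiment grows the completion area by placing the whole swipe-image footprint at the top-left coordinates (X, Y) and stops once the total covered area (or, alternatively, the feature-point count) passes a threshold. The sketch below illustrates the area-union reading; the rasterized union on a unit grid and the rectangle representation are simplifying assumptions made only for this example.

```python
from typing import List, Tuple

Rect = Tuple[float, float, float, float]  # (x, y, width, height) of a placed swipe image


def total_registered_area(image_rects: List[Rect]) -> int:
    """Union area of the placed swipe-image rectangles, counting overlapping
    regions only once (one reading of the 'total area' in step S980).
    Rasterized on a unit grid purely for illustration; a real implementation
    would use exact rectangle-union geometry."""
    covered = set()
    for x, y, w, h in image_rects:
        for ix in range(int(x), int(x + w)):
            for iy in range(int(y), int(y + h)):
                covered.add((ix, iy))
    return len(covered)


def registration_done(image_rects: List[Rect], area_threshold: float) -> bool:
    """Step S980: stop sensing once the union area exceeds the threshold
    (the text notes that a feature-point count may be used instead)."""
    return total_registered_area(image_rects) > area_threshold
```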
FIG. 10 is a schematic diagram of a pre-registration data set according to the third embodiment of the present invention. FIGS. 11A-11I are schematic diagrams of finger swipe actions and corresponding user interface displays according to the third embodiment of the present invention. Please refer to FIG. 1, FIG. 10, and FIGS. 11A to 11I in combination. The flow of FIG. 9 may also be applied to this embodiment. In this embodiment, after the fingerprint sensor 120 obtains the first swipe image 1010 of the finger F, the processor 110 analyzes the swipe image 1010 to obtain a plurality of feature points 1011 of the swipe image 1010, and obtains the coordinate parameters (X1, Y1) of the feature point 1011 located at the top-left corner of the swipe image 1010. As shown in FIG. 11A, the processor 110 displays a completion area 1111 of the reference image 1110 in the user interface 1100 according to the coordinate parameters (X1, Y1) and the area of the swipe image 1010 (i.e., the area of the sensing surface of the fingerprint sensor 120). In addition, the processor 110 generates a pre-registration data set according to the feature points of the swipe image 1010.
Next, the processor 110 obtains and analyzes the next swipe image 1020, and obtains a plurality of feature points 1011 of the swipe image 1020. In the present embodiment, the processor 110 compares the feature points of the swipe images 1010 and 1020, finds the feature points included in both swipe images 1010 and 1020 to obtain the relative positional relationship between the swipe images 1010 and 1020, and obtains the coordinate parameters (X2, Y2) of the feature point located at the top-left corner of the swipe image 1020. As shown in FIG. 11B, the processor 110 increases the display range of the completion area 1111 of the reference image 1110 according to the coordinate parameters (X2, Y2) and the area of the swipe image 1020. In addition, the processor 110 incorporates the feature points of the swipe image 1020 into the pre-registration data set to generate a merged pre-registration data set.
That is, each time the processor 110 obtains a swipe image, its feature points are incorporated into the pre-registration data set. In addition, the processor 110 obtains the coordinate parameters of the feature point located at the top-left corner of the swipe image to determine the increased range and position of the completion area 1111 of the reference image 1110 of the user interface 1100. It is noted that the processor 110 determines whether to terminate fingerprint registration by determining whether the total area of the pre-registration data set is greater than a predetermined threshold. If the total area of the pre-registration data set is not greater than the predetermined threshold, the processor 110 senses and obtains the next swipe image. As shown in FIG. 10 and FIGS. 11E to 11F, during the fingerprint registration process, after the user slides the finger for the first time, the finger leaves the fingerprint sensor 120. If sufficient fingerprint data has not been obtained, i.e., the total area of the pre-registration data set is not greater than the predetermined threshold, the processor 110 displays a prompt on the user interface via the display 140 to ask the user to swipe the finger again. During the second swipe, the processor 110 obtains the first swipe image 1030 of the second swipe, extracts the feature points of the swipe image 1030 and incorporates them into the pre-registration data set, finds the coordinate parameters (Xn, Yn) of the feature point located at the top-left corner of the swipe image 1030, and increases the range of the completion area 1111 of the reference image 1110 of the user interface 1100 according to the coordinate parameters (Xn, Yn) and the area of the swipe image 1030.
Specifically, by comparing and analyzing the feature points of the pre-registration data set and those of the swipe image 1030 (i.e., finding the feature points that appear in both), the processor 110 can calculate the relative positional relationship between the swipe image 1030 and the previously obtained swipe images, and accordingly obtain the coordinate parameters (Xn, Yn). In other words, the processor 110 displays the completion area 1111 in the reference image 1110 in correspondence with the relative positional relationship between the swipe image 1030 and the previously obtained swipe images. Furthermore, the processor 110 determines whether the total area of the new pre-registration data set is greater than a predetermined threshold to determine whether to end the fingerprint registration.
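One way to picture the matching of repeated feature points is a brute-force translation vote: try the offset implied by every candidate pairing and keep the offset that the most points agree with. The sketch below is only an illustrative stand-in for this idea; actual fingerprint matchers compare local descriptors around minutiae rather than raw coordinates, and the tolerance value is an arbitrary assumption.

```python
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) coordinates of a fingerprint feature point


def estimate_offset(registered: List[Point],
                    new_points: List[Point],
                    tolerance: float = 1.0) -> Tuple[float, float]:
    """Estimate the translation that aligns a new swipe image with the
    already-merged feature points, by voting over candidate pairings."""
    best_offset, best_votes = (0.0, 0.0), -1
    for rx, ry in registered:
        for nx, ny in new_points:
            dx, dy = rx - nx, ry - ny          # candidate translation
            votes = sum(
                1 for px, py in new_points
                if any(abs(px + dx - qx) <= tolerance and
                       abs(py + dy - qy) <= tolerance
                       for qx, qy in registered)
            )
            if votes > best_votes:
                best_offset, best_votes = (dx, dy), votes
    return best_offset  # shift to apply to the new points before merging them
```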
Taking FIGS. 11A to 11I as an example, as shown in FIG. 11A, when the user places a finger F on the fingerprint sensor 120 of the electronic device 100 and performs a swipe action, a completion area 1111 is displayed on the reference image 1110 of the user interface 1100 to correspond to the swipe image taken via the fingerprint sensor 120. As shown in FIGS. 11B to 11D, while the finger F is swiping, the range of the completion area 1111 of the reference image 1110 is adjusted accordingly. As shown in FIG. 11E, when the user's finger F leaves the fingerprint sensor 120, the range of the completion area 1111 of the reference image 1110 stops increasing. However, since fingerprint registration is still incomplete, the user interface 1100 continues to display the existing completion area 1111 of the reference image 1110, and prompts the user to swipe the finger again. As shown in FIGS. 11F to 11H, the finger F of the user is again placed on the fingerprint sensor 120 of the electronic device 100 and performs a swiping motion, and the range of the completion area 1111 of the reference image 1110 continues to increase according to the user's swiping motion. As shown in FIG. 11I, when the total area of the pre-registration data set is greater than the predetermined threshold, the processor 110 determines that sufficient fingerprint data has been acquired, and the range of the completion area 1111 has increased to cover a sufficient range of the reference image 1110. Therefore, the processor 110 stops the fingerprint sensing operation of the fingerprint sensor 120 and generates fingerprint registration data according to the pre-registration data set, thereby completing the fingerprint registration process.
In summary, the user interface display method and the electronic device of the present invention collect a plurality of swipe images obtained from one or more swipe motions of a user's finger on a fingerprint sensor, and merge the feature point data of the swipe images to generate fingerprint registration data. When merging the feature point data of the swipe images, the electronic device further analyzes the repetition and positional relationships among the feature points of the swipe images to obtain corresponding image parameters and/or coordinate parameters. Accordingly, the user interface display method and the electronic device of the invention can display, on the display, a user interface with a reference image and its completion area according to the image parameters and/or coordinate parameters, so as to dynamically adjust the range of the completion area of the reference image in the user interface. That is, while swiping a finger for fingerprint registration, the user can know the progress of fingerprint registration through the change in the range of the completion area of the reference image in the user interface displayed on the display of the electronic device. Therefore, during the process of fingerprint registration by sliding the user's finger, the user interface display method and the electronic device provide the user with real-time information about the fingerprint registration progress, thereby providing a more user-friendly and more convenient fingerprint registration procedure.
Although the present invention has been described with reference to the above embodiments, it should be understood that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined by the appended claims.