CN109669651B - Display method of user interface and electronic device - Google Patents

Display method of user interface and electronic device

Info

Publication number
CN109669651B
Authority
CN
China
Prior art keywords
image
swipe
processor
feature points
fingerprint
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810349409.2A
Other languages
Chinese (zh)
Other versions
CN109669651A (en)
Inventor
江元麟
吕俊超
许献仁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Egis Technology Inc
Original Assignee
Egis Technology Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Egis Technology Inc
Priority to US16/105,987 (patent US10713463B2)
Priority to PCT/CN2018/110264 (WO2019076272A1)
Priority to JP2019551393A (patent JP6836662B2)
Priority to GB2117618.5A (patent GB2599288B)
Priority to GB1914055.7A (patent GB2574973B)
Priority to US16/360,017 (patent US10755068B2)
Publication of CN109669651A
Application granted
Publication of CN109669651B
Legal status: Active, Current
Anticipated expiration

Abstract

The invention provides a display method of a user interface and an electronic device. The display method of the user interface is suitable for fingerprint registration. The display method comprises the following steps: sensing an object by a fingerprint sensor to obtain a swipe image of the object; analyzing the swipe image to obtain a plurality of feature points of the swipe image; generating a pre-registration data set according to the swipe image; analyzing the pre-registration data set to obtain an image adjustment parameter; and displaying the user interface, and adjusting the range of a completion area of a reference image in the user interface according to the image adjustment parameter. Therefore, the display method of the user interface and the electronic device of the invention enable the user to follow the corresponding fingerprint registration progress on the display of the electronic device while registering a fingerprint in a swiping manner.

Description

Display method of user interface and electronic device
Technical Field
The present invention relates to an interface display technology, and more particularly, to a display method of a user interface suitable for fingerprint registration and an electronic device using the same.
Background
In recent years, fingerprint identification technology has been widely applied to various electronic devices to provide identity registration or identity verification functions. Conventional fingerprint recognition technology typically registers a fingerprint by having the user press a finger against a fingerprint sensor once or multiple times, and provides a corresponding user interface to inform the user of the fingerprint registration progress. For example, if a fingerprint is registered by multiple presses, each press enlarges the corresponding fingerprint image displayed on the user interface until the entire fingerprint, or a sufficiently large portion of it, is displayed, indicating that fingerprint registration is complete.
However, if the user registers a fingerprint by swiping a finger, the conventional fingerprint identification technology cannot update the fingerprint image on the user interface according to the swiping progress of the user's finger to inform the user of the registration progress. That is, while swiping a finger to perform fingerprint registration, the user cannot know the progress of fingerprint registration in real time.
Disclosure of Invention
The invention provides a display method of a user interface and an electronic device, which enable a user to follow the corresponding fingerprint registration progress on a display of the electronic device while registering a fingerprint in a swiping manner.
The display method of the user interface is suitable for fingerprint registration. The display method comprises the following steps: acquiring a swipe image through a fingerprint sensor; analyzing the swipe image to obtain a plurality of feature points of the swipe image; determining whether the swipe image is the first swipe image; if the swipe image is the first swipe image, generating a pre-registration data set according to the plurality of feature points of the swipe image, and analyzing the pre-registration data set to obtain a base image parameter; and displaying a completion area on a reference image in the user interface according to the base image parameter.
The display method of the user interface is suitable for fingerprint registration. The display method comprises the following steps: acquiring a swipe image through a fingerprint sensor; analyzing the swipe image to obtain a plurality of feature points of the swipe image and a coordinate parameter of the feature point located at the top-left corner of the swipe image; determining whether the swipe image is the first swipe image; if the swipe image is the first swipe image, generating a pre-registration data set according to the plurality of feature points of the swipe image; and displaying a completion area on a reference image in the user interface according to the coordinate parameter and the area of the swipe image.
An electronic device of the present invention includes a fingerprint sensor, a processor, and a display. The fingerprint sensor is used to acquire a swipe image. The processor is coupled to the fingerprint sensor. The processor analyzes the swipe image to obtain a plurality of feature points of the swipe image and determines whether the swipe image is the first swipe image. The display is coupled to the processor. If the processor determines that the swipe image is the first swipe image, the processor generates a pre-registration data set according to the feature points of the swipe image, and analyzes the pre-registration data set to obtain a base image parameter. The processor displays a completion area on a reference image in the user interface through the display according to the base image parameter.
An electronic device of the present invention includes a fingerprint sensor, a processor, and a display. The fingerprint sensor is used to acquire a swipe image. The processor is coupled to the fingerprint sensor. The processor analyzes the swipe image and obtains the coordinate parameters of the feature point located at the top-left corner of the swipe image. The processor is further configured to determine whether the swipe image is the first swipe image. The display is coupled to the processor. If the processor determines that the swipe image is the first swipe image, the processor generates a pre-registration data set according to the feature points of the swipe image. The processor displays a completion area on a reference image in the user interface according to the coordinate parameters and the area of the swipe image.
Based on the above, the display method of the user interface and the electronic device of the present invention obtain corresponding image adjustment parameters by analyzing the plurality of swipe images obtained during the fingerprint registration process, and present the range change of the completion area of the reference image in the user interface according to the image adjustment parameters, providing the user with real-time information on the fingerprint registration progress.
In order to make the aforementioned and other features and advantages of the invention more comprehensible, embodiments accompanied with figures are described in detail below.
Drawings
FIG. 1 is a schematic diagram of an electronic device according to an embodiment of the invention;
FIG. 2 is a flow diagram of a method of fingerprint enrollment in accordance with an embodiment of the present invention;
FIG. 3 is a flow chart of a fingerprint enrollment method according to a first embodiment of the present invention;
FIG. 4 is a schematic diagram of a pre-registration data set in accordance with a first embodiment of the present invention;
FIGS. 5A-5E are schematic diagrams of finger actions and corresponding user interface displays according to a first embodiment of the present invention;
FIG. 6 is a flow chart of a fingerprint enrollment method according to a second embodiment of the present invention;
FIG. 7 is a schematic diagram of a pre-registration data set in accordance with a second embodiment of the present invention;
FIGS. 8A-8G are schematic diagrams of finger movements and corresponding user interface displays according to a second embodiment of the present invention;
FIG. 9 is a flowchart of a fingerprint registration method according to a third embodiment of the present invention;
FIG. 10 is a schematic diagram of a pre-registration data set in accordance with a third embodiment of the present invention;
FIGS. 11A-11I are schematic diagrams of finger actions and corresponding user interface displays according to a third embodiment of the invention.
The reference numbers illustrate:
100: an electronic device;
110: a processor;
120: a fingerprint sensor;
130: a memory;
140: a display;
410, 420, 400, 710, 720, 730, 1010, 1020, 1030: images;
411, 711, 1011: feature points;
500, 800, 1100: user interfaces;
510, 810, 1110: reference images;
511, 811, 811b, 1111: completion areas;
DW1, DW2: widths;
F: a finger;
h, H, W: image parameters;
Step: image adjustment parameter;
S210, S220, S225, S230, S232, S235, S240, S245, S250, S310, S320, S331, S332, S333, S334, S340, S341, S350, S380, S610, S620, S631, S632, S633, S634, S640, S641, S650, S680, S910, S920, S922, S924, S925, S926, S940, S980: steps;
(X1, Y1), (X2, Y2), (Xn, Yn): coordinate parameters;
(Δx, Δy): displacement parameters.
Detailed Description
In order that the present disclosure may be more readily understood, the following specific examples are given as illustrative of the invention which may be practiced in various ways. Further, wherever possible, the same reference numbers will be used throughout the drawings and the description to refer to the same or like parts.
Fig. 1 is a schematic diagram of an electronic device according to an embodiment of the invention. Referring to fig. 1, the electronic device 100 includes a processor 110, a fingerprint sensor 120, a memory 130, and a display 140. The processor 110 is coupled to the fingerprint sensor 120, the memory 130, and the display 140. The electronic device 100 may be an electronic product such as a smart phone, a notebook (NB), a tablet PC, and the like. In the present embodiment, the electronic device 100 performs a fingerprint sensing operation through the fingerprint sensor 120 to acquire a fingerprint image of a user. In the present embodiment, when a user puts a finger on the fingerprint sensor 120 to perform a swiping operation, the fingerprint sensor 120 performs fingerprint sensing. The fingerprint sensor 120 continuously acquires a plurality of swipe images and provides them to the processor 110. The processor 110 analyzes the swipe images to extract a plurality of feature points from each swipe image, wherein the feature points are fingerprint feature points of the finger. The processor 110 then generates fingerprint registration data based on the feature point data.
In the embodiment, the fingerprint sensor 120 obtains the swipe images one by one, and while the processor 110 analyzes the swipe images one by one, the processor 110 correspondingly changes a completion area of the fingerprint reference image included in a user interface (UI) displayed on the display 140 according to the analysis result of each swipe image obtained. In this embodiment, the reference image in the user interface includes a completion area. The completion area of the reference image represents the coverage of the acquired fingerprint information, and its range gradually increases corresponding to the current progress of the user's swipe (i.e., the progress of acquiring the fingerprint information). Therefore, the fingerprint registration function of the electronic device 100 of the present invention provides a good interaction effect, so that the user can know the current registration progress.
In the embodiment, the processor 110 is, for example, a central processing unit (CPU), a system on chip (SOC), another programmable general-purpose or special-purpose microprocessor, a digital signal processor (DSP), a programmable controller, an application-specific integrated circuit (ASIC), a programmable logic device (PLD), another similar processing device, or a combination thereof.
In the present embodiment, the fingerprint sensor 120 is, for example, a capacitive fingerprint sensor or an optical fingerprint sensor, and the present invention does not limit the type of the fingerprint sensor 120. In the present embodiment, the fingerprint sensing mechanism of the fingerprint sensor 120 may be swiping sensing or pressing sensing. It is noted that the fingerprint registration according to the embodiments of the present invention is performed by swiping sensing. That is, during the fingerprint registration process, the user slides a finger over the sensing surface of the fingerprint sensor 120, and the fingerprint sensor 120 obtains the fingerprint information of the user through the sensing surface. For example, the electronic device 100 may be designed such that the user performs fingerprint registration by swiping a finger (the fingerprint sensor 120 performs fingerprint sensing by swiping sensing), while for fingerprint authentication the user presses a finger (the fingerprint sensor 120 performs fingerprint sensing by pressing sensing).
In the present embodiment, the memory 130 is used to store the fingerprint data according to the embodiments of the present invention, and to store the related application programs for the processor 110 to read and execute.
In the present embodiment, the display 140 is, for example, a liquid crystal display (LCD), a light-emitting diode (LED) display, a micro LED display, an organic LED (OLED) display, or the like, and the type of the display 140 is not limited in the present invention. In the present embodiment, when a user performs fingerprint registration, the display 140 displays a corresponding user interface that includes a reference image simulating a fingerprint. While the finger slides over the fingerprint sensor 120, the range of the completion area of the reference image displayed on the display 140 increases correspondingly as the fingerprint data sensed by the fingerprint sensor 120 increases.
Fig. 2 is a flowchart of a fingerprint registration method according to an embodiment of the present invention. Referring to fig. 1 and fig. 2, the fingerprint registration method of the present embodiment can be applied to the electronic device 100 of the embodiment of fig. 1. When the user registers a fingerprint, the electronic device 100 performs a swipe fingerprint sensing operation through the fingerprint sensor 120 to obtain swipe images of the object (i.e., the user's finger) one by one. In step S210, the fingerprint sensor 120 acquires a swipe image. In step S220, the processor 110 analyzes the swipe image to obtain a plurality of feature points of the swipe image. In step S225, the processor 110 determines whether the swipe image is the first swipe image. If yes, the flow proceeds to step S230. In step S230, the processor 110 generates a pre-registration data set according to the feature points of the swipe image, and analyzes the pre-registration data set to obtain a base image parameter (h). In step S232, the processor 110 displays the completion area of the reference image on the user interface according to the base image parameter (h). If not, the flow proceeds to step S235. In step S235, the processor 110 merges the feature points of the swipe image into the pre-registration data set to generate a merged pre-registration data set. In step S240, the processor 110 analyzes the merged pre-registration data set to obtain a merged image parameter (H), and obtains an image adjustment parameter (Step = H − h) according to the merged image parameter (H) and the base image parameter (h); that is, the image adjustment parameter (Step) is equal to the merged image parameter (H) minus the base image parameter (h). In step S245, the processor 110 uses the merged image parameter (H) as the new base image parameter (h). In step S250, the processor 110 increases the range of the completion area of the reference image in the user interface according to the image adjustment parameter (Step), i.e., increases the length of the completion area. In order to further understand the display manner of the user interface and the technical details of fingerprint registration, a plurality of embodiments are provided below for detailed description.
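The S210-S250 loop above can be sketched in Python. This is an illustrative reconstruction rather than the patented implementation: feature points are modeled as (x, y) tuples, and `merged_length` uses one reading of h/H given later in the text (the distance between the two feature points farthest apart in the length direction). All names are hypothetical.

```python
def merged_length(points):
    # One reading of h/H: distance between the two feature points
    # farthest apart in the length (y) direction.
    ys = [y for _, y in points]
    return max(ys) - min(ys)

def enroll(swipes, threshold):
    """Simulate the Fig. 2 flow. Each swipe is a set of (x, y) feature points.
    Returns the completion-area length shown after each swipe image."""
    merged = set()   # the pre-registration data set
    h = 0            # base image parameter
    bar = 0          # displayed completion-area length
    progress = []
    for i, points in enumerate(swipes):
        merged |= points              # S230 / S235: merge feature points
        H = merged_length(merged)     # S240: merged image parameter
        if i == 0:
            bar = H                   # S232: initial completion area = h
        else:
            bar += H - h              # S250: grow by Step = H - h
        h = H                         # S245: H becomes the new base parameter
        progress.append(bar)          # bar length shown on the UI
        if h > threshold:             # stop once enough data is gathered
            break
    return progress
```

With three overlapping swipe images, `enroll` reports a monotonically growing bar length and stops once the merged length passes the threshold.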
Fig. 3 is a flowchart of a fingerprint registration method according to a first embodiment of the present invention. Referring to fig. 1 and fig. 3, the fingerprint registration method of the present embodiment can be applied to the electronic device 100 of the embodiment of fig. 1. When the user performs fingerprint registration, in step S310, the electronic device 100 senses an object (i.e., the user's finger) through the fingerprint sensor 120 to obtain a swipe image. In step S320, the processor 110 analyzes the swipe image to obtain a plurality of feature points of the swipe image. In step S331, the processor 110 determines whether the swipe image is the first swipe image. If yes, the processor 110 executes step S332. In step S332, the processor 110 generates a pre-registration data set based on the feature points of the swipe image, and obtains a base image parameter (h) of the pre-registration data set. In step S333, the processor 110 displays the completion area of the reference image according to the base image parameter (h). If not, the processor 110 executes step S334. In step S334, the processor 110 merges the feature points of the swipe image into the pre-registration data set to generate a merged pre-registration data set.
In step S340, the processor 110 analyzes the merged pre-registration data set to obtain a merged image parameter (H), and obtains an image adjustment parameter (Step = H − h) according to the merged image parameter (H) and the base image parameter (h); that is, the image adjustment parameter (Step) is equal to the merged image parameter (H) minus the base image parameter (h). In step S341, the processor 110 uses the merged image parameter (H) as the new base image parameter (h). In step S350, the processor 110 increases the range of the completion area of the reference image, i.e., increases the length of the completion area, according to the image adjustment parameter (Step). In step S380, the processor 110 determines whether the merged image parameter (H) is greater than a preset threshold. If so, sufficient fingerprint registration data has been obtained: the processor 110 terminates the fingerprint sensing operation of the fingerprint sensor 120 and stores the pre-registration data set in the memory 130 as a fingerprint registration data set to complete the fingerprint registration process. If not, the processor 110 executes step S310 to obtain the next swipe image.
Fig. 4 is a schematic diagram of a pre-registration data set according to a first embodiment of the invention. FIGS. 5A-5E are schematic diagrams of finger swipe motions and corresponding user interface displays according to the first embodiment of the present invention. Referring to fig. 1, fig. 4, and fig. 5A to fig. 5E, the flow of fig. 3 may also be applied to the present embodiment. In the embodiment, after the fingerprint sensor 120 obtains the first swipe image 410 of the finger F, the processor 110 analyzes the swipe image 410 to obtain a plurality of feature points 411 and a base image parameter h of the swipe image 410. It is noted that the area of the swipe image 410 may be equal to the area of the sensing surface of the fingerprint sensor 120. In the present embodiment, the initial value of the base image parameter h may be the distance between the two feature points 411 of the first swipe image 410 that are farthest apart in the length direction, but the present invention is not limited thereto. In another embodiment, the initial value of the base image parameter h may also be the length of the swipe image 410, i.e., the length of the sensing surface of the fingerprint sensor 120. The processor 110 generates a pre-registration data set according to the feature points 411 of the swipe image 410. Then, the processor 110 continues to obtain the next swipe image 420 and obtains the feature points of the swipe image 420. In the present embodiment, the processor 110 merges the feature points of the swipe image 420 into the pre-registration data set (i.e., merges the feature points of the swipe images 410 and 420) to generate a new pre-registration data set 400. The processor 110 calculates the merged image parameter H of the merged pre-registration data set 400. Similarly, the merged image parameter H may be the distance between the two feature points that are farthest apart in the length direction after merging the swipe images 410 and 420, or may be the sum of the lengths of the two swipe images (i.e., twice the length of the sensing surface of the fingerprint sensor 120) minus the length of the overlapping portion of the swipe images 410 and 420. Next, the processor 110 subtracts the base image parameter h from the merged image parameter H to obtain the image adjustment parameter Step (= H − h).
That is, each time the processor 110 incorporates the feature points of a swipe image, the processor 110 calculates the new length of the merged pre-registration data set 400 to adjust the length of the completion area 511 of the reference image 510 in the user interface 500. It is noted that the width DW of the completion area 511 of the reference image 510 is preset and fixed. Each time feature point data of a swipe image is added, the processor 110 correspondingly increases the length of the completion area 511. In addition, the processor 110 determines whether the merged image parameter H is greater than a predetermined threshold. If yes, enough fingerprint registration data has been obtained. For example, when the merged image parameter H is greater than the preset threshold, a sufficient number of fingerprint feature points have been acquired; it may also indicate that a sufficient number of swipe images have been taken. Therefore, the processor 110 stores the pre-registration data set in the memory 130 as a fingerprint registration data set to complete the fingerprint registration process.
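The completion-area behavior described above (fixed width, length growing by Step, capped at the full reference image) can be modeled with a minimal sketch. The class and attribute names are hypothetical, chosen only to mirror the labels in the figures.

```python
class CompletionArea:
    """Minimal model of the completion area 511: the width DW is preset and
    fixed, and only the length grows, clamped at the full reference image."""

    def __init__(self, ref_length, width):
        self.ref_length = ref_length   # length of the reference image 510
        self.width = width             # fixed width DW
        self.length = 0                # nothing displayed before the first swipe

    def grow(self, step):
        # Increase the length by the image adjustment parameter Step,
        # never past the full reference image (as in Fig. 5E).
        self.length = min(self.length + step, self.ref_length)

    def done(self):
        # Registration-display is complete once the completion area
        # covers the whole reference image.
        return self.length >= self.ref_length
```

A `grow` call per merged swipe image reproduces the bar-filling behavior of Figs. 5A-5E, including the clamp when the final Step overshoots the reference image length.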
For example, take fig. 5A to 5E. As shown in fig. 5A, when the user places a finger F on the fingerprint sensor 120 of the electronic device 100 and performs a swipe action, the electronic device 100 displays a reference image 510 on the user interface 500 and acquires swipe images one by one. When the first swipe image is obtained, the electronic device 100 displays a corresponding completion area 511 in the reference image 510, wherein the length of the completion area 511 corresponds to the base image parameter h of the first swipe image. Further, as described above, the width of the completion area 511 is preset and fixed; as shown in the drawing, the width of the completion area 511 may be equal to or greater than the width of the reference image 510. As shown in fig. 5B, when the second swipe image is obtained, the electronic device 100 increases the length of the completion area 511 of the reference image 510 according to the image adjustment parameter Step. As shown, the width of the completion area 511 remains fixed. As shown in fig. 5C, when the user's finger F leaves the fingerprint sensor 120, the length of the completion area 511 of the reference image 510 stops increasing. However, since sufficient fingerprint information has not yet been obtained, i.e., the merged image parameter H is not greater than the predetermined threshold, the fingerprint registration procedure is not yet complete, so the user interface 500 continues to display the completion area 511 of the reference image 510 and prompts the user to swipe the finger again. Next, as shown in fig. 5D, the user's finger F is placed on the fingerprint sensor 120 of the electronic device 100 again and performs a swiping motion. The electronic device 100 obtains a new swipe image and continues to increase the length of the completion area 511 of the reference image 510 according to the new swipe image (i.e., the new fingerprint feature point data). Finally, as shown in fig. 5E, when the merged image parameter H is greater than the predetermined threshold, the processor 110 completely covers the reference image 510 with the completion area 511; that is, the length of the completion area 511 is greater than or equal to the length of the reference image 510. Thus, the processor 110 stops the fingerprint sensing operation of the fingerprint sensor 120 and generates a fingerprint registration data set from the pre-registration data set to complete the fingerprint registration process.
Fig. 6 is a flowchart of a fingerprint registration method according to a second embodiment of the present invention. Referring to fig. 1 and fig. 6, the fingerprint registration method of the present embodiment can be applied to the electronic device 100 of the embodiment of fig. 1. When the user performs fingerprint registration, in step S610, the electronic device 100 senses an object (i.e., the user's finger) through the fingerprint sensor 120 to obtain a swipe image. In step S620, the processor 110 analyzes the swipe image to obtain a plurality of feature points of the swipe image. In step S631, the processor 110 determines whether the swipe image is the first swipe image. If yes, the processor 110 executes step S632. In step S632, the processor 110 obtains the coordinate parameters (X, Y) of the feature point located at the top-left corner of the swipe image, generates a pre-registration data set according to the feature points of the swipe image, and obtains a base image parameter (h) of the pre-registration data set. In step S633, the processor 110 displays the completion area of the reference image on the display 140 according to the base image parameter (h) and the coordinate parameters (X, Y). If not, the processor 110 executes step S634. In step S634, the processor 110 merges the feature points of the swipe image into the pre-registration data set to generate a merged pre-registration data set.
In step S640, the processor 110 analyzes the pre-registration data set to obtain a first merged image parameter (H) and a second merged image parameter (W), and obtains an image adjustment parameter (Step = H − h) according to the first merged image parameter (H) and the base image parameter (h). The first merged image parameter (H) may be the distance between the two feature points of the merged pre-registration data set that are farthest apart in the length direction, or may be the total length of the swipe images minus the length of their overlapping portions. The second merged image parameter (W) is the distance between the two merged feature points that are farthest apart in the width direction. In step S641, the processor 110 uses the first merged image parameter (H) as the new base image parameter (h). In step S650, the processor 110 increases the range of the completion area of the reference image according to the image adjustment parameter (Step). In step S680, the processor 110 determines whether the first merged image parameter (H) is greater than a first preset threshold, and whether the second merged image parameter (W) is greater than a second preset threshold. If so, the processor 110 ends the fingerprint sensing operation of the fingerprint sensor 120 and generates fingerprint registration data according to the merged pre-registration data set to complete the fingerprint registration procedure. If not, the processor 110 executes step S610 to obtain the next swipe image.
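The two-parameter termination test of step S680 can be sketched as follows. This is an illustrative reading of the text, with feature points again modeled as (x, y) tuples and the farthest-apart interpretation of H and W; the function names are hypothetical.

```python
def merged_extents(points):
    """(H, W) of a merged feature-point set: the farthest-apart distances in
    the length (y) and width (x) directions, per the step S640 description."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return max(ys) - min(ys), max(xs) - min(xs)

def enrollment_complete(points, h_threshold, w_threshold):
    # Step S680: both the merged length H and the merged width W must
    # exceed their respective thresholds before registration can finish.
    H, W = merged_extents(points)
    return H > h_threshold and W > w_threshold
```

Unlike the first embodiment, a set of points that is long enough (H above its threshold) but too narrow (W at or below its threshold) does not complete enrollment, which is what motivates the second-swipe prompt described below.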
Fig. 7 is a schematic diagram of a pre-registration data set according to a second embodiment of the invention. Fig. 8A-8G are schematic diagrams of finger actions and corresponding user interface displays according to the second embodiment of the present invention. Referring to fig. 1, fig. 7, and fig. 8A to fig. 8G, the flow of fig. 6 may also be applied to the present embodiment. In this embodiment, after the fingerprint sensor 120 obtains the first swipe image 710 of the finger F, the processor 110 analyzes the swipe image 710 to obtain a plurality of feature points 711 of the swipe image 710, the base image parameter h, and the coordinate parameters (X1, Y1) of the feature point located at the top-left corner of the swipe image 710. The processor 110 displays the completion area 811 of the reference image 810 according to the coordinate parameters (X1, Y1) and the base image parameter h. It is noted that the area of the swipe image 710 is equal to the area of the sensing surface of the fingerprint sensor 120. In the present embodiment, the base image parameter h may refer to the distance between the two feature points 711 of the swipe image 710 that are farthest apart in the length direction, but the present invention is not limited thereto. In another embodiment, the base image parameter h may also refer to the length of the swipe image 710, i.e., the length of the sensing surface of the fingerprint sensor 120. The processor 110 generates a pre-registration data set according to the feature points 711 of the swipe image 710. Then, the processor 110 continues to obtain the next swipe image 720 and obtains the feature points of the swipe image 720. In the present embodiment, the processor 110 merges the feature points of the swipe image 720 into the pre-registration data set (i.e., merges the feature points of the swipe images 710 and 720) to generate the merged pre-registration data set 700. The processor 110 calculates a first merged image parameter H (i.e., the maximum image length) and a second merged image parameter W (i.e., the maximum image width) of the merged pre-registration data set 700. The processor 110 subtracts the base image parameter h from the first merged image parameter H to obtain the image adjustment parameter Step. Then, the processor 110 increases the length of the completion area 811 of the reference image 810 according to the image adjustment parameter Step.
That is, each time the processor 110 incorporates the feature points of a swipe image, the processor 110 calculates the new length of the merged pre-registration data set 700 to adjust the length of the finished area 811 of the reference image 810 in the user interface 800. It is noted that the width of the finished area 811 of the reference image 810 is preset and fixed within each finger swipe motion; that is, the width of the finished area 811 of the reference image 810 increases only once per finger swipe. During each finger swipe motion, the processor 110 correspondingly increases the length of the finished area 811 every time feature point data of a swipe image is added. In addition, after the processor 110 merges a plurality of swipe images into the pre-registration data set 700, the processor 110 determines whether the first merged image parameter H and the second merged image parameter W are greater than the first and second predetermined thresholds, respectively. If yes, enough fingerprint registration data has been obtained. For example, when the first merged image parameter H is greater than the first predetermined threshold and the second merged image parameter W is greater than the second predetermined threshold, a sufficient number of fingerprint feature points, and equivalently a sufficient number of swipe images, has been obtained. The processor 110 therefore stores the pre-registration data set in the memory 130 as a fingerprint registration data set to complete the fingerprint registration process. If not, the processor 110 may display a prompt on the user interface via the display 140 asking the user to swipe the finger again. During the second swipe, the processor 110 obtains the first swipe image 730 of the second swipe through the fingerprint sensor 120, extracts the feature points of the swipe image 730, and incorporates them into the pre-registration data set 700.
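The stop condition described above can be sketched as a simple predicate (the threshold values below are illustrative only, not taken from the patent):

```python
def registration_complete(merged_h, merged_w, h_threshold, w_threshold):
    """Fingerprint registration ends only when BOTH merged image
    parameters exceed their preset thresholds (sketch of the check
    performed after each merge)."""
    return merged_h > h_threshold and merged_w > w_threshold

# Enough length but not enough width: prompt the user to swipe again.
need_more = registration_complete(180, 60, h_threshold=150, w_threshold=80)
# Both thresholds exceeded: store the pre-registration data set.
done = registration_complete(180, 90, h_threshold=150, w_threshold=80)
```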
The processor 110 obtains the displacement parameters (ΔX, ΔY) according to the coordinate parameters (X1, Y1) of the feature point located at the top-left corner of the swipe image 710 (i.e., the first swipe image obtained during the first swipe) and the coordinate parameters (X2, Y2) of the feature point located at the top-left corner of the first swipe image 730 obtained during the second swipe (ΔX = X2 - X1, ΔY = Y2 - Y1). The processor 110 then determines, according to the displacement parameters (ΔX, ΔY), the width range and the position of the finished area 811 of the reference image 810 of the user interface 800 corresponding to the width increase of the second swipe motion.
In other words, if the user's finger F is shifted to the right or left during the second swipe, the processor 110 determines the change in the width range of the finished area 811 corresponding to the second swipe through the coordinate parameters (X2, Y2) of the top-left feature point in the first swipe image 730 obtained during the second swipe, i.e., through the displacement parameters (ΔX, ΔY), and determines the length of the new portion 811b of the finished area 811 corresponding to the second swipe according to the base image parameter h of the first swipe image 730. The processor 110 displays the start position of the portion 811b of the finished area 811 corresponding to the second swipe according to the coordinate parameters (X2, Y2). That is, during the second swipe, the range of the finished area 811 in the width direction grows according to the displacement of the user's finger F. The processor 110 then increases the length of the portion 811b of the finished area 811 corresponding to the second swipe according to each subsequently obtained swipe image and the corresponding image adjustment parameter Step. Each time a swipe image is obtained, the processor 110 determines whether the first merged image parameter H and the second merged image parameter W are greater than the first and second predetermined thresholds, respectively, to decide whether to end the fingerprint registration process.
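The displacement bookkeeping between the two swipes can be sketched as follows (hypothetical function name; the coordinates are toy values, with y growing toward the top of the sensor in this sketch):

```python
def second_swipe_anchor(first_topleft, second_topleft):
    """Return the displacement parameters (dx, dy) between the top-left
    feature points of the first swipe images of two successive swipe
    motions. The new portion 811b of the finished area starts at the
    second top-left coordinates, i.e., shifted by (dx, dy) relative to
    the first swipe (sketch)."""
    x1, y1 = first_topleft
    x2, y2 = second_topleft
    dx, dy = x2 - x1, y2 - y1      # (ΔX, ΔY) = (X2 - X1, Y2 - Y1)
    return dx, dy

# The finger lands up and to the right of its first-swipe position.
dx, dy = second_swipe_anchor((10, 5), (18, 2))
```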
Take figs. 8A to 8G as an example. As shown in fig. 8A, when the user presses the finger F on the fingerprint sensor 120 of the electronic device 100 and performs a first swipe motion, a corresponding finished area 811 is displayed on the reference image 810 of the user interface 800 to correspond to the first swipe image obtained by the fingerprint sensor 120. As shown in figs. 8B and 8C, during the first swipe of the finger F, the length range of the finished area 811 of the reference image 810 is adjusted accordingly. In addition, during the first swipe of the finger F, the processor 110 increases the length range of the finished area 811 of the reference image 810 according to the image adjustment parameter Step while keeping the image width fixed. That is, during the first swipe motion of the finger F, the width DW1 of the finished area 811 of the reference image 810 is fixed, and its length increases as each new swipe image is obtained. As shown in fig. 8D, when the user's finger F leaves the fingerprint sensor 120, the range of the finished area 811 of the reference image 810 stops increasing. However, since fingerprint registration is still incomplete, the user interface 800 keeps displaying the reference image 810 and the existing finished area 811, and prompts the user to swipe the finger again. Therefore, as shown in figs. 8E and 8F, the user's finger presses the fingerprint sensor 120 of the electronic device 100 again and performs the second swipe motion. Compared with the finger position during the first swipe, the finger position during the second swipe is shifted toward the upper right by a distance.
After the first swipe image of the second swipe motion is obtained, the processor 110 calculates the coordinate parameters (X2, Y2) of its top-left feature point, subtracts the coordinate parameters (X1, Y1) of the top-left feature point of the first swipe image obtained during the first swipe motion from (X2, Y2) to obtain the displacement parameters (ΔX, ΔY) (ΔX = X2 - X1, ΔY = Y2 - Y1), and determines, based on (ΔX, ΔY), the display position of the first swipe image of the second swipe, that is, the start position of the new portion 811b of the finished area 811 corresponding to the second swipe. As shown, during the second swipe of the finger, the range of the finished area 811 of the reference image 810 continues to increase (i.e., portion 811b): the processor 110 determines the start position of the newly added portion 811b of the finished area 811 according to the displacement parameters (ΔX, ΔY), and increases the length range of the portion 811b of the finished area 811 of the reference image 810 corresponding to the second swipe according to the image adjustment parameter Step while keeping the image width fixed. That is, during the second swipe motion of the finger F, the width DW2 of the newly added portion 811b of the finished area 811 of the reference image 810 is fixed, and its length increases as each new swipe image is obtained. Finally, as shown in fig. 8G, when the first merged image parameter H and the second merged image parameter W of the pre-registration data set 700 are respectively greater than the first and second predetermined thresholds, the range of the finished area 811 of the reference image 810 has grown to a sufficient length and width, so the processor 110 stops the fingerprint sensing operation of the fingerprint sensor 120 to complete the fingerprint enrollment process.
Fig. 9 is a flowchart of a fingerprint registration method according to a third embodiment of the present invention. Referring to fig. 1 and fig. 9, the fingerprint registration method of this embodiment can be applied to the electronic device 100 of the embodiment of fig. 1. When the user performs fingerprint registration, in step S910, the electronic device 100 senses an object (i.e., the user's finger) with the fingerprint sensor 120 to obtain a swipe image. In step S920, the processor 110 analyzes the swipe image to obtain a plurality of feature points of the swipe image, and obtains the coordinate parameters (X, Y) of the feature point located at the top-left corner of the swipe image. In step S922, the processor 110 determines whether the swipe image is the first swipe image. If yes, step S924 is executed: the processor 110 generates a pre-registration data set according to the feature points of the swipe image. Next, in step S925, the processor 110 displays a corresponding completion area on the reference image of the user interface according to the coordinate parameters (X, Y) and the area of the swipe image. Notably, the area of the swipe image is equal to the area of the sensing surface of the fingerprint sensor 120. If not, step S926 is executed: the processor 110 incorporates the feature points of the swipe image into the pre-registration data set. In step S940, the processor 110 increases the range of the completion area according to the coordinate parameters (X, Y) and the area of the swipe image. In step S980, the processor 110 determines whether the total area of the pre-registration data set is greater than a preset threshold. The total area of the pre-registration data set may represent the sum of the areas of all swipe images minus the overlapping areas of the swipe images, or it may represent the number of feature points included in the pre-registration data set.
In other words, in this embodiment, in step S980, the processor 110 determines whether the number of feature points included in the pre-registration data set is greater than a preset threshold. If so, the processor 110 ends the fingerprint sensing operation of the fingerprint sensor 120 and generates fingerprint registration data according to the merged pre-registration data set to complete the fingerprint registration procedure. If not, the processor 110 returns to step S910 to sense and obtain the next swipe image.
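The loop of steps S910 to S980 can be summarized as follows (function names and the threshold are placeholders; `sense_swipe_image` stands in for the fingerprint sensor 120, and the feature-point-count interpretation of "total area" is used):

```python
def enroll_fingerprint(sense_swipe_image, extract_features, threshold):
    """Sketch of the third embodiment's flow: accumulate the feature
    points of successive swipe images in a pre-registration data set
    until it is large enough, here measured by feature-point count
    (one of the two interpretations given in the text)."""
    pre_registration = set()
    while True:
        image = sense_swipe_image()            # step S910
        points = extract_features(image)       # step S920
        pre_registration |= points             # steps S924 / S926
        if len(pre_registration) > threshold:  # step S980
            return pre_registration            # becomes the registration data

# Toy "sensor" yielding three overlapping sets of feature points.
batches = iter([{(0, 0), (1, 1)}, {(1, 1), (2, 2)}, {(3, 3), (4, 4)}])
data = enroll_fingerprint(lambda: next(batches), lambda img: img, threshold=4)
```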
Fig. 10 is a schematic diagram of pre-registration data according to a third embodiment of the present invention. Figs. 11A to 11I are schematic diagrams of finger swipe actions and the corresponding user interface displays according to the third embodiment of the present invention. Please refer to fig. 1, fig. 10, and figs. 11A to 11I together. The flow of fig. 9 may also be applied to this embodiment. In this embodiment, after the fingerprint sensor 120 obtains the first swipe image 1010 of the finger F, the processor 110 analyzes the swipe image 1010 to obtain a plurality of feature points 1011 of the swipe image 1010, and obtains the coordinate parameters (X1, Y1) of the feature point located at the top-left corner of the swipe image 1010. As shown in fig. 11A, the processor 110 displays a completion area 1111 of the reference image 1110 in the user interface 1100 according to the coordinate parameters (X1, Y1) and the area of the swipe image 1010 (i.e., the area of the sensing surface of the fingerprint sensor 120). In addition, the processor 110 generates pre-registration data according to the feature points of the swipe image 1010.
Next, the processor 110 obtains and analyzes the next swipe image 1020 to obtain its plurality of feature points. In this embodiment, the processor 110 compares the feature points of the swipe images 1010 and 1020, finds the feature points included in both swipe images 1010 and 1020 to obtain the relative positional relationship between the swipe images 1010 and 1020, and obtains the coordinate parameters (X2, Y2) of the feature point located at the top-left corner of the swipe image 1020. As shown in fig. 11B, the processor 110 increases the display range of the completion area 1111 of the reference image 1110 according to the coordinate parameters (X2, Y2) and the area of the swipe image 1020. In addition, the processor 110 incorporates the feature points of the swipe image 1020 into the pre-registration data to generate the merged pre-registration data.
That is, each time the processor 110 obtains a swipe image, its feature points are incorporated into the pre-registration data. In addition, the processor 110 obtains the coordinate parameters of the feature point located at the top-left corner of the swipe image to determine the increased range and position of the completion area 1111 of the reference image 1110 of the user interface 1100. It is noted that the processor 110 decides whether to end fingerprint registration by determining whether the total area of the pre-registration data is greater than a predetermined threshold. If the total area of the pre-registration data is not greater than the predetermined threshold, the processor 110 senses and obtains the next swipe image. As shown in fig. 10 and figs. 11E to 11F, during the fingerprint registration process, after the user swipes the finger for the first time, the finger leaves the fingerprint sensor 120. If sufficient fingerprint data has not been obtained, i.e., the total area of the pre-registration data is not greater than the predetermined threshold, the processor 110 displays a prompt on the user interface via the display 140 asking the user to swipe the finger again. During the second swipe, the processor 110 obtains the first swipe image 1030 of the second swipe, extracts the feature points of the swipe image 1030 and incorporates them into the pre-registration data, finds the coordinate parameters (Xn, Yn) of the feature point located at the top-left corner of the swipe image 1030, and increases the range of the completion area 1111 of the reference image 1110 of the user interface 1100 according to the coordinate parameters (Xn, Yn) and the area of the swipe image 1030.
Specifically, by comparing and analyzing the pre-registration data and the feature points of the swipe image 1030 (i.e., finding the feature points that appear in both), the processor 110 can calculate the relative positional relationship between the swipe image 1030 and the previously obtained swipe images, and accordingly obtain the coordinate parameters (Xn, Yn). In other words, the processor 110 displays the corresponding completion area 1111 in the reference image 1110 according to the relative positional relationship between the swipe image 1030 and the previously obtained swipe images. Furthermore, the processor 110 determines whether the total area of the new pre-registration data is greater than the predetermined threshold to decide whether to end fingerprint registration.
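One way to estimate the relative position of a new swipe image from repeated feature points, consistent with the description above, is sketched below (a minimal sketch, assuming each feature point is identified by some comparable descriptor and carries per-image coordinates; the patent does not specify the matching method):

```python
def relative_offset(prev_points, new_points):
    """Estimate the (dx, dy) offset of a new swipe image relative to the
    previously obtained images by averaging the coordinate differences
    of the feature points present in both. Points are modeled as
    {feature_id: (x, y)} dictionaries (hypothetical representation)."""
    common = prev_points.keys() & new_points.keys()
    if not common:
        return None  # no overlapping feature points: position unknown
    dxs = [new_points[f][0] - prev_points[f][0] for f in common]
    dys = [new_points[f][1] - prev_points[f][1] for f in common]
    return sum(dxs) / len(dxs), sum(dys) / len(dys)

prev = {"a": (0, 0), "b": (3, 1)}
new = {"b": (1, 5), "c": (4, 6)}   # feature "b" appears in both images
offset = relative_offset(prev, new)
```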
Take figs. 11A to 11I as an example. As shown in fig. 11A, when the user places a finger F on the fingerprint sensor 120 of the electronic device 100 and performs a swipe action, a completion area 1111 is displayed on the reference image 1110 of the user interface 1100 to correspond to the swipe image obtained via the fingerprint sensor 120. As shown in figs. 11B to 11D, as the finger F swipes, the range of the completion area 1111 of the reference image 1110 is adjusted accordingly. As shown in fig. 11E, when the user's finger F leaves the fingerprint sensor 120, the range of the completion area 1111 of the reference image 1110 stops increasing. However, since fingerprint registration is still incomplete, the user interface 1100 keeps displaying the existing completion area 1111 of the reference image 1110 and prompts the user to swipe the finger again. As shown in figs. 11F to 11H, the user's finger F is placed on the fingerprint sensor 120 of the electronic device 100 again and performs a swipe motion, and the range of the completion area 1111 of the reference image 1110 continues to increase according to the user's swipe motion. As shown in fig. 11I, when the total area of the pre-registration data is greater than the predetermined threshold, the processor 110 determines that sufficient fingerprint data has been acquired, and the range of the completion area 1111 has grown to cover a sufficient range of the reference image 1110. Therefore, the processor 110 stops the fingerprint sensing operation of the fingerprint sensor 120 and generates fingerprint registration data according to the pre-registration data, completing the fingerprint registration process.
In summary, the user interface display method and the electronic device of the present invention collect a plurality of swipe images obtained from one or more swipe motions of a user's finger on a fingerprint sensor, and merge the feature point data of the swipe images to generate fingerprint registration data. When merging the feature point data of the swipe images, the electronic device further analyzes the repetition of feature points and the positional relationships among the swipe images to obtain corresponding image parameters and/or coordinate parameters. The display method and the electronic device can therefore display, on the display, a user interface with a reference image and its completion area according to the image parameters and/or coordinate parameters, dynamically adjusting the range of the completion area of the reference image in the user interface. That is, while swiping a finger for fingerprint registration, the user can follow the progress of registration through the changing range of the completion area of the reference image displayed on the display of the electronic device. The display method of the user interface and the electronic device thus provide the user with real-time information on the fingerprint registration progress during the finger swipes, offering a friendlier and more convenient fingerprint registration procedure.
Although the present invention has been described with reference to the above embodiments, it should be understood that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (12)

CN201810349409.2A2017-10-162018-04-18Display method of user interface and electronic deviceActiveCN109669651B (en)
