US20220343681A1 - Evaluating method and system for face verification, and computer storage medium - Google Patents


Info

Publication number
US20220343681A1
US20220343681A1
Authority
US
United States
Prior art keywords
verification, querying, images, dataset, identifiers
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US17/744,548
Other versions
US12272175B2 (en)
Inventor
Yang Zhou
Jie Liu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to US17/744,548 (granted as US12272175B2)
Assigned to GUANGDONG OPPO MOBILE TELECOMMUNICATIONS CORP., LTD. Assignment of assignors interest (see document for details). Assignors: LIU, JIE; ZHOU, YANG
Publication of US20220343681A1
Application granted
Publication of US12272175B2
Legal status: Active (current)
Adjusted expiration


Abstract

An evaluating method and a system for face verification, and a computer storage medium, are provided. The method comprises: obtaining a querying dataset and a training dataset, wherein the querying dataset comprises querying images, and the training dataset comprises training images; generating verification pairs and an evaluation list respectively according to the querying dataset and the training dataset, wherein the evaluation list comprises the querying images and the training images; determining feature information based on the evaluation list; establishing correspondences between identifiers and features according to first identifiers of the querying images, second identifiers of the training images, and the feature information; and determining evaluation results of the verification pairs according to the correspondences between the identifiers and the features.
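As a non-limiting illustration, the abstract's flow can be sketched in a few lines of Python; the identifier scheme and the `extract_feature` and `is_same_person` callables are assumptions standing in for the detection, recognition, and verification stages described below, not the patent's actual implementation.

from itertools import product

def evaluate(querying_images, training_images, extract_feature, is_same_person):
    # First identifiers for querying images, second identifiers for training images.
    query_ids = {f"q{i}": img for i, img in enumerate(querying_images)}
    train_ids = {f"t{i}": img for i, img in enumerate(training_images)}

    # Verification pairs: querying images paired with training images.
    verification_pairs = list(product(query_ids, train_ids))

    # Evaluation list: every image, each appearing exactly once.
    evaluation_list = {**query_ids, **train_ids}

    # Correspondences between identifiers and features.
    id_to_feature = {image_id: extract_feature(image)
                     for image_id, image in evaluation_list.items()}

    # Evaluation results of the verification pairs.
    return {(q, t): is_same_person(id_to_feature[q], id_to_feature[t])
            for q, t in verification_pairs}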


Claims (20)

What is claimed is:
1. An evaluating method for face verification, comprising:
obtaining a querying dataset and a training dataset, wherein the querying dataset comprises querying images, and the training dataset comprises training images;
generating verification pairs and an evaluation list respectively according to the querying dataset and the training dataset; wherein the evaluation list comprises the querying images and the training images;
determining feature information based on the evaluation list;
establishing correspondences between identifiers and features according to first identifiers of the querying images, second identifiers of the training images, and the feature information; and
determining evaluation results of the verification pairs according to the correspondences between the identifiers and the features.
2. The method as claimed in claim 1, wherein the method, after the obtaining the querying dataset and the training dataset, further comprises:
extracting the querying images of the querying dataset; and
performing a labeling process for the querying images in sequence to generate the first identifiers, wherein each of the querying images corresponds to a corresponding one of the first identifiers.
3. The method as claimed in claim 1, wherein the method, after the obtaining the querying dataset and the training dataset, further comprises:
extracting the training images of the training dataset; and
performing a labeling process for the training images to generate the second identifiers, wherein each of the training images corresponds to a corresponding one of the second identifiers.
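As a non-limiting illustration of the labeling processes in claims 2 and 3, the sketch below assigns sequential identifiers; the prefixes and placeholder file names are assumptions, not part of the claims.

# Placeholder image lists standing in for the obtained datasets.
querying_images = ["query_0.png", "query_1.png"]
training_images = ["train_0.png", "train_1.png", "train_2.png"]

def label_images(images, prefix):
    # Label the images in sequence; each image corresponds to exactly one identifier.
    return {f"{prefix}{index}": image for index, image in enumerate(images)}

first_identifiers = label_images(querying_images, "q")   # claim 2
second_identifiers = label_images(training_images, "t")  # claim 3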
4. The method as claimed in claim 1, wherein the generating the verification pairs and the evaluation list respectively according to the querying dataset and the training dataset, comprises:
pairing each of the querying images of the querying dataset with at least one of the training images of the training dataset, to generate a verification pair.
5. The method as claimed in claim 4, wherein a first querying image and a first training image that constitute a first verification pair are not entirely the same as a second querying image and a second training image that constitute a second verification pair.
6. The method as claimed in claim 1, wherein the generating the verification pairs and the evaluation list respectively according to the querying dataset and the training dataset, comprises:
constituting the evaluation list according to all of the querying images of the querying dataset and all of the training images of the training dataset.
7. The method as claimed in claim 6, wherein any two images in the evaluation list are different from each other.
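One possible reading of claims 4 through 7 pairs every querying image with every training image; the exhaustive pairing below is only an assumed instance of "at least one", and the placeholder file names are illustrative.

from itertools import product

querying_images = ["q0.png", "q1.png"]            # placeholders
training_images = ["t0.png", "t1.png", "t2.png"]  # placeholders

# Claims 4-5: pair querying images with training images; no two verification
# pairs share both of their members.
verification_pairs = list(product(querying_images, training_images))

# Claims 6-7: the evaluation list holds all images, each exactly once, so any
# two entries differ even when an image occurs in many verification pairs.
evaluation_list = list(dict.fromkeys(querying_images + training_images))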
8. The method as claimed in claim 3, wherein the determining the feature information based on the evaluation list, comprises:
performing a detecting process for the evaluation list, to obtain facial feature maps corresponding to the evaluation list; and
performing a feature-identifying process for the facial feature maps, to obtain feature information corresponding to the facial feature maps.
9. The method as claimed in claim 8, wherein the performing the detecting process for the evaluation list to obtain the facial feature maps corresponding to the evaluation list, comprises:
inputting all images in the evaluation list into a preset face detection model in sequence, and outputting the facial feature maps; wherein each of the images corresponds to a corresponding one of the facial feature maps.
10. The method as claimed in claim 9, wherein the performing the feature-identifying process for the facial feature maps to obtain the feature information corresponding to the facial feature maps, comprises:
inputting the facial feature maps to a preset feature recognition model, and outputting the feature information; wherein each of the facial feature maps corresponds to a corresponding group of feature information.
11. The method as claimed in claim 10, wherein the establishing the correspondences between the identifiers and the features according to the first identifiers of the querying images, the second identifiers of the training images, and the feature information, comprises:
establishing correspondences between the identifiers and the feature maps based on the first identifiers, the second identifiers, and the facial feature maps; and
establishing the correspondences between the identifiers and the features according to the correspondences between the identifiers and the feature maps, and the feature information.
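The sketch below strings claims 8 through 11 together, with `detect_face` standing in for the preset face detection model and `recognize` for the preset feature recognition model; both callables and the dictionary layout are assumptions for illustration, not the claimed models.

def build_id_to_feature(evaluation_list, identifiers, detect_face, recognize):
    """Claims 8-11: detect faces, extract features, and key them by identifier.

    `identifiers` maps an identifier to its image (first and second identifiers
    merged); `detect_face` and `recognize` are placeholder models.
    """
    image_to_id = {image: image_id for image_id, image in identifiers.items()}

    # Claim 9: each image in the evaluation list yields one facial feature map.
    id_to_feature_map = {image_to_id[image]: detect_face(image)
                         for image in evaluation_list}

    # Claims 10-11: each feature map yields one group of feature information,
    # recorded under the same identifier.
    return {image_id: recognize(feature_map)
            for image_id, feature_map in id_to_feature_map.items()}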
12. The method as claimed in claim 1, wherein the determining the evaluation results of the verification pairs according to the correspondences between the identifiers and the features, comprises:
performing a face verifying process for the verification pairs according to the correspondences between the identifiers and the features, to generate face-verification results; and
performing an evaluating process for the face-verification results, to obtain the evaluation results.
13. The method as claimed in claim 12, wherein the performing the face verifying process for the verification pairs according to the correspondences between the identifiers and the features to generate the face-verification results, comprises:
determining a first target identifier and a second target identifier corresponding to a verification pair;
determining first feature information according to the first target identifier and the correspondences between the identifiers and the features;
determining second feature information according to the second target identifier and the correspondences between the identifiers and the features; and
generating a face-verification result based on the first feature information and the second feature information.
14. The method as claimed in claim 13, wherein the generating the face-verification result based on the first feature information and the second feature information comprises:
in response to the first feature information being the same as the second feature information, determining the face-verification result to be a pass; and
in response to the first feature information being different from the second feature information, determining the face-verification result to be a failure.
15. The method as claimed in claim 13, wherein the generating the face-verification result based on the first feature information and the second feature information, comprises:
calculating a similarity between the first feature information and the second feature information;
in response to the similarity being larger than or equal to a preset similarity threshold, determining the face-verification result to be a pass; and
in response to the similarity being less than the preset similarity threshold, determining the face-verification result to be a failure.
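Claim 15 leaves the similarity measure open; the sketch below uses cosine similarity with an illustrative threshold, a common but assumed choice for face feature vectors rather than one fixed by the claims.

import numpy as np

def verify_pair(first_feature, second_feature, threshold=0.6):
    # Cosine similarity between the two feature vectors; the 0.6 threshold is
    # an illustrative preset value, not one specified by the claims.
    first = np.asarray(first_feature, dtype=np.float64)
    second = np.asarray(second_feature, dtype=np.float64)
    similarity = float(first @ second /
                       (np.linalg.norm(first) * np.linalg.norm(second)))
    return "pass" if similarity >= threshold else "failure"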
16. The method as claimed in claim 12, wherein the performing the evaluating process for the face-verification results to obtain the evaluation results, comprises:
obtaining an original verification result corresponding to a verification pair; and
performing the evaluating process for a face-verification result of the verification pair according to the original verification result thereof, to obtain an evaluation result thereof.
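One way to read claim 16 is to treat the original verification results as expected outcomes and score the face-verification results against them; the accuracy summary below is an assumed metric, not one named in the claims.

def evaluate_results(face_verification_results, original_results):
    # Both arguments map a verification pair to "pass" or "failure".
    evaluation = {pair: face_verification_results[pair] == original_results[pair]
                  for pair in face_verification_results}
    accuracy = sum(evaluation.values()) / max(len(evaluation), 1)
    return evaluation, accuracy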
17. The method as claimed in claim 1, further comprising:
determining memory parameters;
processing the verification pairs in batches according to the memory parameters, to obtain at least one batch of the verification pairs; and
storing the at least one batch of the verification pairs in sequence.
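Claim 17 does not define the memory parameters; the sketch below assumes they reduce to a per-batch pair count and keeps the batches in order for sequential storage.

def batch_verification_pairs(verification_pairs, pairs_per_batch):
    # Split the verification pairs into consecutive batches sized by the
    # memory budget, preserving their order for sequential storage.
    return [verification_pairs[start:start + pairs_per_batch]
            for start in range(0, len(verification_pairs), pairs_per_batch)]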
18. The method as claimed in claim 1, wherein the obtaining the querying dataset and the training dataset, comprises:
obtaining a plurality of images by an infrared camera, wherein the images are all gray-scale images; and
determining the querying dataset and the training dataset according to the images.
19. An evaluating system, comprising:
a processor and a storage that stores instructions executable by the processor, wherein when the instructions are executed by the processor, the processor performs an evaluating method for face verification, and the method comprises:
obtaining a querying dataset and a training dataset, wherein the querying dataset comprises querying images, and the training dataset comprises training images;
generating verification pairs and an evaluation list respectively according to the querying dataset and the training dataset; wherein the evaluation list comprises the querying images and the training images;
determining feature information based on the evaluation list;
establishing correspondences between identifiers and features according to first identifiers of the querying images, second identifiers of the training images, and the feature information; and
determining evaluation results of the verification pairs according to the correspondences between the identifiers and the features.
20. A non-transitory computer storage medium, having programs stored therein, applied in an evaluating system, wherein the programs are executed by a processor, to perform an evaluating method for face verification, and the method comprises:
obtaining a querying dataset and a training dataset, wherein the querying dataset comprises querying images, and the training dataset comprises training images;
generating verification pairs and an evaluation list respectively according to the querying dataset and the training dataset; wherein the evaluation list comprises the querying images and the training images;
determining feature information based on the evaluation list;
establishing correspondences between identifiers and features according to first identifiers of the querying images, second identifiers of the training images, and the feature information; and
determining evaluation results of the verification pairs according to the correspondences between the identifiers and the features.

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
US17/744,548 (US12272175B2) | 2019-11-20 | 2022-05-13 | Evaluating method and system for face verification, and computer storage medium

Applications Claiming Priority (3)

Application Number | Priority Date | Filing Date | Title
US201962938250P | 2019-11-20 | 2019-11-20 |
PCT/CN2020/130070 (WO2021098772A1) | 2019-11-20 | 2020-11-19 | Assessment method and system for facial verification, and computer storage medium
US17/744,548 (US12272175B2) | 2019-11-20 | 2022-05-13 | Evaluating method and system for face verification, and computer storage medium

Related Parent Applications (1)

Application Number | Title | Priority Date | Filing Date
PCT/CN2020/130070 (Continuation, WO2021098772A1) | Assessment method and system for facial verification, and computer storage medium | 2019-11-20 | 2020-11-19

Publications (2)

Publication Number | Publication Date
US20220343681A1 (en) | 2022-10-27
US12272175B2 (en) | 2025-04-08

Family

ID=75981034

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
US17/744,548 (US12272175B2, Active, adjusted expiration 2041-11-25) | Evaluating method and system for face verification, and computer storage medium | 2019-11-20 | 2022-05-13

Country Status (2)

Country | Link
US (1) | US12272175B2 (en)
WO (1) | WO2021098772A1 (en)



Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN106446754A (en)* | 2015-08-11 | 2017-02-22 | Alibaba Group Holding Ltd | Image identification method, metric learning method, image source identification method and devices

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20060093208A1 (en)* | 2004-10-29 | 2006-05-04 | Fayin Li | Open set recognition using transduction
US20120320181A1 (en)* | 2011-06-16 | 2012-12-20 | Samsung Electronics Co., Ltd. | Apparatus and method for security using authentication of face
US20170140212A1 (en)* | 2015-11-16 | 2017-05-18 | MorphoTrak, LLC | Facial Matching System
US20200285896A1 (en)* | 2019-03-09 | 2020-09-10 | Tongji University | Method for person re-identification based on deep model with multi-loss fusion training strategy

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
TWI856773B (en)* | 2022-12-29 | 2024-09-21 | Altek Corporation | Auto framing method and related camera apparatus
US12219238B2 | 2022-12-29 | 2025-02-04 | Altek Corporation | Auto framing method and related camera apparatus

Also Published As

Publication number | Publication date
WO2021098772A1 (en) | 2021-05-27
US12272175B2 (en) | 2025-04-08

Similar Documents

Publication | Title
CN107633204B (en) | Face occlusion detection method, apparatus and storage medium
CN112784741A (en) | Pet identity recognition method and device and nonvolatile storage medium
CN108038176B (en) | Method and device for establishing passerby library, electronic equipment and medium
US20120320181A1 (en) | Apparatus and method for security using authentication of face
CN110751022A (en) | Urban pet activity track monitoring method based on image recognition and related equipment
CN113591921B (en) | Image recognition method and device, electronic equipment and storage medium
WO2019033574A1 (en) | Electronic device, dynamic video face recognition method and system, and storage medium
CN110688878B (en) | Living body identification detection method, living body identification detection device, living body identification detection medium, and electronic device
CN102945366A (en) | Method and device for face recognition
WO2019033570A1 (en) | Lip movement analysis method, apparatus and storage medium
CN112052731B (en) | Intelligent portrait identification card punching attendance system and method
CN108108711B (en) | Face control method, electronic device and storage medium
Haji et al. | Real time face recognition system (RTFRS)
CN106056083A (en) | Information processing method and terminal
TW202042113A (en) | Face recognition system, establishing data method for face recognition, and face recognizing method thereof
WO2019062588A1 (en) | Information recognition method and apparatus, and electronic device
CN115019364A (en) | Identity authentication method, device, electronic device and medium based on face recognition
CN113837006A (en) | Face recognition method and device, storage medium and electronic equipment
US12272175B2 (en) | Evaluating method and system for face verification, and computer storage medium
CN110175500B (en) | Finger vein comparison method, device, computer equipment and storage medium
CN110348386B (en) | Face image recognition method, device and equipment based on fuzzy theory
CN113065010B (en) | Signage image management method, device, computer equipment and storage medium
Kumar et al. | Smart face recognition using IoT and machine learning
Marutotamtama et al. | Face Recognition and Face Spoofing Detector for Attendance System
CN104778462A (en) | Face recognition method and device

Legal Events

Date | Code | Title | Description

FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS | Assignment | Owner name: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS CORP., LTD., CHINA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: ZHOU, YANG; LIU, JIE; SIGNING DATES FROM 20220415 TO 20220417; REEL/FRAME: 059962/0515

STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED

STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED

STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED

STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF | Information on status: patent grant | Free format text: PATENTED CASE

