Specific embodiment
It should be appreciated that the specific embodiments described herein are merely illustrative of the present invention and are not intended to limit the present invention.
In the following description, suffixes such as "module", "component" or "unit" used to denote elements are merely intended to facilitate the description of the invention and have no specific meaning in themselves. Therefore, "module", "component" and "unit" may be used interchangeably.
A terminal may be implemented in various forms. For example, the terminal described in the present invention may include mobile terminals such as a mobile phone, a tablet computer, a laptop computer, a palmtop computer, a personal digital assistant (PDA), a portable media player (PMP), a navigation device, a wearable device, a smart bracelet and a pedometer, as well as fixed terminals such as a digital TV and a desktop computer.
In the following description, a mobile terminal is taken as an example. Those skilled in the art will appreciate that, apart from elements used specifically for mobile purposes, the construction according to the embodiments of the present invention can also be applied to terminals of the fixed type.
Referring to Fig. 1, which is a schematic diagram of the hardware structure of a mobile terminal for implementing various embodiments of the present invention, the mobile terminal 100 may include components such as an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110 and a power supply 111. Those skilled in the art will understand that the mobile terminal structure shown in Fig. 1 does not constitute a limitation on the mobile terminal, and the mobile terminal may include more or fewer components than illustrated, combine certain components, or adopt a different component arrangement.
The components of the mobile terminal are described below in detail with reference to Fig. 1:
The radio frequency unit 101 may be used for receiving and sending signals during information transmission and reception or during a call. Specifically, downlink information from a base station is received and then delivered to the processor 110 for processing; in addition, uplink data is sent to the base station. In general, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer and the like. Furthermore, the radio frequency unit 101 can also communicate with a network and other devices via wireless communication. The above wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplexing-Long Term Evolution), TDD-LTE (Time Division Duplexing-Long Term Evolution) and so on.
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the mobile terminal can help the user send and receive e-mail, browse web pages, access streaming media and the like, and it provides the user with wireless broadband Internet access. Although Fig. 1 shows the WiFi module 102, it should be understood that it is not an essential component of the mobile terminal and may be omitted as needed without changing the essence of the invention.
The audio output unit 103 may, when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a speech recognition mode, a broadcast reception mode or the like, convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output it as sound. Moreover, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (for example, a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a loudspeaker, a buzzer and the like.
The A/V input unit 104 is used for receiving audio or video signals. The A/V input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042. The graphics processing unit 1041 processes image data of still pictures or video obtained by an image capture apparatus (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processing unit 1041 may be stored in the memory 109 (or another storage medium) or sent via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 can receive sound (audio data) in an operating mode such as a phone call mode, a recording mode or a speech recognition mode, and can process such sound into audio data. The processed audio (voice) data may, in the phone call mode, be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to eliminate (or suppress) noise or interference generated during the sending and receiving of audio signals.
The mobile terminal 100 further includes at least one sensor 105, such as a light sensor, a motion sensor and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor, wherein the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the mobile terminal 100 is moved to the ear. As a kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in various directions (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications that identify the posture of the mobile phone (such as landscape/portrait switching, related games and magnetometer posture calibration), vibration-identification-related functions (such as a pedometer and tapping) and so on. Other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecule sensor, a gyroscope, a barometer, a hygrometer, a thermometer and an infrared sensor may also be configured on the mobile phone, and details are not described herein.
The display unit 106 is used for displaying information input by the user or information provided to the user. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) or the like.
The user input unit 107 may be used for receiving input numeric or character information and generating key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, can collect touch operations of the user on or near it (for example, operations performed on or near the touch panel 1071 with a finger, a stylus or any other suitable object or accessory) and drive the corresponding connected devices according to a preset program. The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch orientation of the user, detects the signal brought by the touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types such as resistive, capacitive, infrared and surface acoustic wave. Besides the touch panel 1071, the user input unit 107 may also include other input devices 1072. Specifically, the other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and a power key), a trackball, a mouse, a joystick and the like, and no limitation is imposed herein.
Further, the touch panel 1071 may cover the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, the touch operation is transmitted to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in Fig. 1 the touch panel 1071 and the display panel 1061 are shown as two independent components implementing the input and output functions of the mobile terminal, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal, and no specific limitation is imposed herein.
The interface unit 108 serves as an interface through which at least one external device can be connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port and the like. The interface unit 108 may be used to receive input (for example, data information, power, etc.) from an external device and transfer the received input to one or more elements in the mobile terminal 100, or may be used to transmit data between the mobile terminal 100 and an external device.
The memory 109 may be used for storing software programs and various data. The memory 109 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function and an image playing function) and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book) and the like. In addition, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, for example at least one magnetic disk storage device, a flash memory device or other solid-state storage components.
The processor 110 is the control center of the mobile terminal. It connects all parts of the entire mobile terminal using various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, thereby monitoring the mobile terminal as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, wherein the application processor mainly handles the operating system, the user interface, application programs and the like, and the modem processor mainly handles wireless communication. It can be understood that the above modem processor may also not be integrated into the processor 110.
The mobile terminal 100 may also include a power supply 111 (such as a battery) for supplying power to each component. Preferably, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions such as charging management, discharging management and power consumption management through the power management system.
Although not shown in Fig. 1, the mobile terminal 100 may also include a Bluetooth module and the like, and details are not described herein.
To facilitate the understanding of the embodiments of the present invention, the communication network system on which the mobile terminal of the invention is based is described below.
Referring to Fig. 2, Fig. 2 is an architecture diagram of a communication network system provided by an embodiment of the present invention. The communication network system is an LTE system of the universal mobile communication technology, and the LTE system includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203 and an operator IP service 204, which are connected in communication in sequence.
Specifically, the UE 201 may be the above-mentioned terminal 100, and details are not described herein again.
The E-UTRAN 202 includes an eNodeB 2021, other eNodeBs 2022 and so on. The eNodeB 2021 may be connected to the other eNodeBs 2022 through a backhaul (for example, an X2 interface), the eNodeB 2021 is connected to the EPC 203, and the eNodeB 2021 can provide the UE 201 with access to the EPC 203.
The EPC 203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036 and so on. The MME 2031 is a control node that handles signaling between the UE 201 and the EPC 203, and provides bearer and connection management. The HSS 2032 is used to provide registers to manage functions such as a home location register (not shown), and stores user-specific information such as service characteristics and data rates. All user data may be sent through the SGW 2034; the PGW 2035 can provide IP address allocation and other functions for the UE 201; and the PCRF 2036 is the policy and charging control decision point for service data flows and IP bearer resources, which selects and provides available policy and charging control decisions for a policy and charging enforcement function unit (not shown).
The IP service 204 may include the Internet, an intranet, an IMS (IP Multimedia Subsystem) or other IP services and the like.
Although the above description takes the LTE system as an example, those skilled in the art should understand that the present invention is not only applicable to the LTE system, but is also applicable to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA and future new network systems, and no limitation is imposed herein.
Based on the above mobile terminal hardware structure and communication network system, various embodiments of the method of the present invention are proposed.
First embodiment
In order to solve the problem in the prior art that the way of determining motion regions and non-motion regions is not effective, causing the composite photo to exhibit ghosting and resulting in low user satisfaction, this embodiment provides an image region determination method applied to a terminal. The terminal described in this embodiment may be the terminal shown in Fig. 1, or of course another terminal. Specifically, reference may be made to Fig. 3, which is a basic flowchart of the image region determination method provided by this embodiment. The image region determination method includes:
S301: after a photographing instruction is received, N frames of images are shot for the same photographed scene.
In this embodiment, after the photographing instruction is received, N frames of images are shot for the same photographed scene, where N is greater than or equal to 2; for example, N may be 2, 3, 4 and so on. That is, after the photographing instruction is received, at least 2 frames of images are shot for the same photographed scene. The specific value of N may be flexibly set by the user and/or the terminal developer according to actual needs, for example set to 5, 6, 7 and so on. The specific value of N may also be determined according to a preset time and the camera exposure time, where N = preset time / exposure time. It should be noted that the preset time may be flexibly set by the user and/or the terminal developer according to actual needs; for example, after the photographing instruction is received, an excessively long shooting time gives the user a poor experience, so the preset time may be set to 1 second, 2 seconds and so on. The exposure time may be flexibly set by the terminal developer and/or the user according to actual needs, for example set to 0.2 second, 0.3 second and so on. The exposure time may also be adjusted according to the ambient brightness: the lower the ambient brightness, the longer the exposure time. Assuming that the exposure time is fixed at 0.1 second and the preset time is 1 second, then N is 10; that is, after the photographing instruction is received, 10 frames of images are shot for the same photographed scene.
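As a minimal sketch of how N may be derived from the preset time and the exposure time, assuming the two durations are given in seconds (the function name and the rounding choice below are illustrative, not part of the original embodiment):

```python
def compute_frame_count(preset_time_s: float, exposure_time_s: float) -> int:
    """Derive N = preset time / exposure time, keeping at least 2 frames."""
    n = int(round(preset_time_s / exposure_time_s))
    return max(n, 2)  # the embodiment requires N >= 2


# Example from the text: exposure time fixed at 0.1 s, preset time 1 s -> N = 10
print(compute_frame_count(1.0, 0.1))  # 10
```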
In this embodiment, when the N frames of images are shot for the same photographed scene, the N frames may be shot for the same photographed scene by one camera. When two or more cameras are provided on the same face of the terminal, the N frames may also be shot for the same photographed scene by these two or more cameras.
In this embodiment, before the N frames of images are shot for the same photographed scene, it may also be determined whether the ambient brightness is less than a preset brightness threshold; if so, the N frames of images are shot for the same photographed scene; if not, the process ends. The ambient brightness may be obtained by an ambient light sensor or the like, and the preset brightness threshold may be flexibly set by the user and/or the terminal developer according to actual needs. That is, only when the ambient brightness is less than the preset brightness threshold will the N frames of images be shot for the same photographed scene after the photographing instruction is received.
In this embodiment, before the N frames of images are shot for the same photographed scene, it may also be determined whether the terminal is currently in a night scene shooting mode; if so, the N frames of images are shot for the same photographed scene; if not, the process ends. That is, only when the terminal is in the night scene shooting mode will the N frames of images be shot for the same photographed scene after the photographing instruction is received.
In this embodiment, before the N frames of images are shot for the same photographed scene, it may also be determined whether the terminal is currently in the night scene shooting mode and whether the ambient brightness is less than the preset brightness threshold. If the terminal is currently in the night scene shooting mode, or the ambient brightness is less than the preset brightness threshold, the N frames of images are shot for the same photographed scene; if the terminal is not currently in the night scene shooting mode and the ambient brightness is greater than or equal to the preset brightness threshold, the process ends. That is, the N frames of images are shot for the same photographed scene after the photographing instruction is received only when the ambient brightness is less than the preset brightness threshold or the terminal is currently in the night scene shooting mode. It is possible to first determine whether the ambient brightness is less than the preset brightness threshold; if so, shoot the N frames of images for the same photographed scene; if not, then determine whether the terminal is currently in the night scene shooting mode; if so, shoot the N frames of images for the same photographed scene; if not, end. Alternatively, it is possible to first determine whether the terminal is currently in the night scene shooting mode; if so, shoot the N frames of images for the same photographed scene; if not, then determine whether the ambient brightness is less than the preset brightness threshold; if so, shoot the N frames of images for the same photographed scene; if not, end. Of course, it is also possible to determine at the same time whether the ambient brightness is less than the preset brightness threshold and whether the terminal is currently in the night scene shooting mode, shoot the N frames of images for the same photographed scene when the ambient brightness is less than the preset brightness threshold or the terminal is currently in the night scene shooting mode, and end when the terminal is not currently in the night scene shooting mode and the ambient brightness is greater than or equal to the preset brightness threshold.
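As a minimal sketch of the gating condition described above, assuming the night-mode flag and the ambient brightness reading are already available from the terminal (both parameter names are illustrative assumptions):

```python
def should_capture_burst(night_mode: bool,
                         ambient_brightness: float,
                         brightness_threshold: float) -> bool:
    """Shoot the N-frame burst only in night scene mode or when the scene is dark."""
    return night_mode or ambient_brightness < brightness_threshold
```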
S302: M frames of images are selected from the N frames of images as to-be-referenced images.
In this embodiment, after the N frames of images are shot for the same photographed scene, M frames of images are selected from the N frames of images as the to-be-referenced images, where M is greater than or equal to 2. It should be understood that, since the M frames are selected from the N frames, M should be less than or equal to N. For example, M may be equal to N, that is, all the shot images are used as the to-be-referenced images. M may also be N-1, that is, M = N-1. Of course, M may also take other values, which may be flexibly set according to actual needs.
S303: a reference image is obtained by averaging the data of corresponding pixels in the M frames of to-be-referenced images.
In this embodiment, after the to-be-referenced images are selected, the reference image is obtained by averaging the data of corresponding pixels in the to-be-referenced images. For example, assuming that M is 2 and each frame of image contains 3 pixels, the data of the first pixel in the first frame and the data of the first pixel in the second frame are averaged to obtain the data of the first pixel in the reference image, the data of the second pixel in the first frame and the data of the second pixel in the second frame are averaged to obtain the data of the second pixel in the reference image, and the data of the third pixel in the first frame and the data of the third pixel in the second frame are averaged to obtain the data of the third pixel in the reference image, so that the reference image is obtained. The data of a pixel includes its color, brightness and the like.
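As a minimal sketch of this per-pixel averaging, assuming each frame is available as a NumPy array of identical shape (the array representation and dtype handling are illustrative assumptions, not part of the original embodiment):

```python
import numpy as np


def build_reference_image(to_be_referenced: list[np.ndarray]) -> np.ndarray:
    """Average the data of corresponding pixels across the M selected frames."""
    stack = np.stack([frame.astype(np.float32) for frame in to_be_referenced], axis=0)
    return stack.mean(axis=0)  # same shape as a single frame, per-pixel average


# Example from the text: M = 2 frames of 3 pixels each (grayscale values)
frame1 = np.array([10, 20, 30], dtype=np.uint8)
frame2 = np.array([20, 40, 60], dtype=np.uint8)
print(build_reference_image([frame1, frame2]))  # [15. 30. 45.]
```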
S304: a motion region and a non-motion region are determined according to the reference image.
In this embodiment, after the reference image is obtained, the motion region and the non-motion region are determined according to the reference image. For example, each shot frame may be compared with the reference image to determine the motion region and the non-motion region. Alternatively, referring to Fig. 4, the motion region and the non-motion region may also be determined according to the reference image as follows:
S401: one frame of image is selected from the N frames of images as a base image.
In this embodiment, any one frame of image may be selected from the N frames of images as the base image; the first frame of image may be selected as the base image, or the N-th frame of image may be selected as the base image. It should be noted that, in terms of shooting time, the first frame of image is the image with the earliest shooting time (the image shot first among the N frames), and the N-th frame of image is the image with the latest shooting time (the image shot last among the N frames). In this embodiment, a frame of image that was not selected as a to-be-referenced image may be selected from the N frames of images as the base image; of course, a frame of image that was selected as a to-be-referenced image may also be selected as the base image.
S402: the base image is compared with the reference image to determine the motion region and the non-motion region.
In this embodiment, the base image is compared with the reference image to determine the motion region and the non-motion region. Specifically, the base image may be compared with the reference image, a region whose difference is greater than a preset threshold is taken as the motion region, and correspondingly, a region whose difference is less than or equal to the preset threshold is taken as the non-motion region. The preset threshold may be flexibly set according to actual needs.
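As a minimal sketch of this comparison under the same NumPy assumptions as above (the per-pixel absolute difference against a scalar threshold is an illustrative choice; the embodiment does not fix how the difference is computed):

```python
import numpy as np


def classify_regions(base: np.ndarray, reference: np.ndarray,
                     threshold: float) -> np.ndarray:
    """Return a boolean mask that is True where a pixel belongs to the motion region."""
    diff = np.abs(base.astype(np.float32) - reference.astype(np.float32))
    motion_mask = diff > threshold   # difference greater than the preset threshold
    return motion_mask               # ~motion_mask marks the non-motion region
```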
In this embodiment, after the motion region and the non-motion region are determined according to the reference image, multi-frame noise reduction processing may also be performed on the N frames of images based on the determined motion region and non-motion region to obtain a target image, that is, the image finally obtained by shooting. When the multi-frame noise reduction processing is performed, the non-motion regions of the N frames of images may be fused, and for the motion region, the data of the motion region in one of the N frames of images is retained. After the target image is obtained, the target image is saved and displayed.
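As a minimal sketch of this fusion step under the same assumptions (averaging as the fusion operation for the non-motion region is an illustrative choice; the embodiment only requires that the non-motion regions be fused and that the motion region be taken from a single frame):

```python
import numpy as np


def multi_frame_noise_reduction(frames: list[np.ndarray],
                                motion_mask: np.ndarray,
                                keep_index: int = 0) -> np.ndarray:
    """Fuse the non-motion region of all frames; keep the motion region of one frame."""
    stack = np.stack([frame.astype(np.float32) for frame in frames], axis=0)
    fused = stack.mean(axis=0)                        # fusion of the non-motion region
    target = np.where(motion_mask, stack[keep_index], fused)
    return target.astype(frames[0].dtype)
```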
With the image region determination method provided by this embodiment, after the photographing instruction is received, N frames of images are shot for the same photographed scene, where N is greater than or equal to 2; M frames of images are then selected from the N frames of images as the to-be-referenced images, where M is greater than or equal to 2; the reference image is obtained by averaging the data of corresponding pixels in the M frames of to-be-referenced images; and the motion region and the non-motion region are then determined according to the reference image. This improves the accuracy of the determination of the motion region and the non-motion region, so that when multi-frame noise reduction is subsequently performed based on the motion region and the non-motion region, ghosting is largely eliminated and the user's experience satisfaction is improved.
Second embodiment
To better understand the present invention, this embodiment is described with a more specific example. Referring to Fig. 5, Fig. 5 is a detailed flowchart of the image region determination method provided by the second embodiment of the present invention. The image region determination method includes:
S501: a photographing instruction is received.
In this embodiment, after the camera is started, the photographing instruction is received.
S502: it is determined whether the ambient brightness is less than a preset brightness value.
If so, go to S503; if not, end.
In this embodiment, after the photographing instruction is received, the ambient brightness is obtained by the ambient light sensor, and it is determined whether the ambient brightness is less than the preset brightness value. The preset brightness value may be flexibly set by the user and/or the terminal developer according to actual needs. When the ambient brightness is less than the preset brightness value, go to S503; when the ambient brightness is greater than or equal to the preset brightness value, end.
S503: N frames of images are shot for the same photographed scene.
In this embodiment, when it is determined that the ambient brightness is less than the preset brightness value, N frames of images are shot for the same photographed scene, where N is greater than or equal to 2.
For better understanding, in this embodiment it is assumed that N is 5. Referring to Fig. 6, Fig. 6 shows 5 frames of images shot for the same photographed scene, including a first frame image 601, a second frame image 602, a third frame image 603, a fourth frame image 604 and a fifth frame image 605.
S504: M frames of images are selected from the N frames of images as to-be-referenced images.
In this embodiment, after the N frames of images are shot for the same scene, M frames of images are selected from the N frames of images as the to-be-referenced images, where M = N-1. The M frames with the later shooting times may be selected, that is, the second to N-th frame images are selected. Of course, in other embodiments, the M frames with the earlier shooting times may also be selected as the to-be-referenced images.
Continuing the above example, the M frames with the later shooting times are selected, that is, the second frame image, the third frame image, the fourth frame image and the fifth frame image are selected as the to-be-referenced images.
S505: a reference image is obtained by averaging the data of corresponding pixels in the selected M frames of to-be-referenced images.
In this embodiment, after the to-be-referenced images are selected, the reference image is obtained by averaging the data of corresponding pixels in the selected M frames of to-be-referenced images, where the data of a pixel includes the color and brightness of the pixel.
Continuing the above example, referring to Fig. 7, the second frame image 602, the third frame image 603, the fourth frame image 604 and the fifth frame image 605 are averaged to obtain a reference image 606.
S506: one frame of image is selected from the N frames of images as a base image.
In this embodiment, a frame of image that was not selected as a to-be-referenced image is selected as the base image. Since the frames selected as the to-be-referenced images are the images with the later shooting times, the base image is the first frame image.
Continuing the above example, the first frame image 601 is selected as the base image.
S507: the reference image is compared with the base image.
S508: a region whose difference is greater than the preset threshold is taken as the motion region, and a region whose difference is less than or equal to the preset threshold is taken as the non-motion region.
In this embodiment, the base image is compared with the reference image: a region whose difference is greater than the preset threshold is taken as the motion region, and a region whose difference is less than or equal to the preset threshold is taken as the non-motion region, where the preset threshold may be flexibly set according to actual needs.
Continuing the above example, referring to Fig. 8, after the base image (i.e., the first frame image 601) is compared with the reference image 606, the positions of a motion region 801 and a non-motion region 802 in the N frames of images are determined.
S509: multi-frame noise reduction processing is performed on the N frames of images based on the motion region and the non-motion region to obtain a target image.
In this embodiment, when the multi-frame noise reduction is performed based on the motion region and the non-motion region, the non-motion regions of the N frames of images are fused, and the motion region of the first frame image is retained. Of course, in other embodiments, other fusion methods may also be used.
Continuing the above example, the target image 901 is shown in Fig. 9.
S510: the target image is saved.
With the image region determination method provided by this embodiment, after the photographing instruction is received, N frames of images are shot for the same photographed scene, where N is greater than or equal to 2; M frames of images are then selected from the N frames of images as the to-be-referenced images, where M equals N-1; the reference image is obtained by averaging the data of corresponding pixels in the M frames of to-be-referenced images; the image that was not selected as a to-be-referenced image is then taken as the base image, and the base image is compared with the reference image to determine the motion region and the non-motion region. Since the motion region and the non-motion region are determined according to all the shot images, the determined motion region and non-motion region are more accurate, which reduces the probability of ghosting during subsequent multi-frame noise reduction and improves the user's experience satisfaction.
Third embodiment
To better understand the present invention, this embodiment is described with a more specific example. Referring to Fig. 10, Fig. 10 is a detailed flowchart of the image region determination method provided by the third embodiment of the present invention. The image region determination method includes:
S1001: a photographing instruction is received.
In this embodiment, after the camera is started, the photographing instruction is received.
S1002: it is determined whether the terminal is currently in a night scene shooting mode.
If so, go to S1003; if not, end.
In this embodiment, after the photographing instruction is received, it is determined whether the terminal is currently in the night scene shooting mode; if the terminal is currently in the night scene shooting mode, go to S1003; if the terminal is not currently in the night scene shooting mode, end.
S1003: N frames of images are shot for the same photographed scene.
In this embodiment, when it is determined that the terminal is currently in the night scene shooting mode, N frames of images are shot for the same photographed scene, where N is greater than or equal to 2.
For better understanding, in this embodiment it is assumed that N is 3, that is, 3 frames of images are shot for the same photographed scene.
S1004: M frames of images are selected from the N frames of images as to-be-referenced images.
In this embodiment, after the N frames of images are shot for the same scene, M frames of images are selected from the N frames of images as the to-be-referenced images, where M = N. That is, all the shot images are used as the to-be-referenced images.
Continuing the above example, the 3 shot frames of images are used as the to-be-referenced images.
S1005: a reference image is obtained by averaging the data of corresponding pixels in the selected M frames of to-be-referenced images.
In this embodiment, after the to-be-referenced images are selected, the reference image is obtained by averaging the data of corresponding pixels in the selected M frames of to-be-referenced images, where the data of a pixel includes the color and brightness of the pixel.
Continuing the above example, the 3 frames of images are averaged to obtain the reference image.
S1006: one frame of image is selected from the N frames of images as a base image.
In this embodiment, the N-th frame image (the image shot last among the N frames) is selected as the base image. Of course, in other embodiments, the first frame image (the image shot first among the N frames) may also be selected as the base image.
Continuing the above example, the third frame image is selected as the base image.
S1007: the reference image is compared with the base image.
S1008: a region whose difference is greater than the preset threshold is taken as the motion region, and a region whose difference is less than or equal to the preset threshold is taken as the non-motion region.
In this embodiment, the base image is compared with the reference image: a region whose difference is greater than the preset threshold is taken as the motion region, and a region whose difference is less than or equal to the preset threshold is taken as the non-motion region, where the preset threshold may be flexibly set according to actual needs.
S1009: multi-frame noise reduction processing is performed on the N frames of images based on the motion region and the non-motion region to obtain a target image.
In this embodiment, when the multi-frame noise reduction is performed based on the motion region and the non-motion region, the non-motion regions of the N frames of images are fused, and the motion region of the N-th frame image is retained. Of course, in other embodiments, other fusion methods may also be used.
With the image region determination method provided by this embodiment, after the photographing instruction is received, N frames of images are shot for the same photographed scene, where N is greater than or equal to 2; M frames of images are then selected from the N frames of images as the to-be-referenced images, where M is greater than or equal to 2; the reference image is obtained by averaging the data of corresponding pixels in the M frames of to-be-referenced images; and the motion region and the non-motion region are then determined according to the reference image. This improves the accuracy of the determination of the motion region and the non-motion region, reduces the probability of ghosting during subsequent multi-frame noise reduction based on the motion region and the non-motion region, and improves the user's experience satisfaction.
Fourth embodiment
This embodiment provides a terminal. Referring to Fig. 11, the terminal provided by this embodiment includes a processor 1101, a memory 1102 and a communication bus 1103.
The communication bus 1103 in this embodiment is used to implement connection and communication between the processor 1101 and the memory 1102;
The processor 1101 is configured to execute one or more programs stored in the memory 1102, so as to implement the following steps:
after a photographing instruction is received, shooting N frames of images for the same photographed scene;
selecting M frames of images from the N frames of images as to-be-referenced images;
obtaining a reference image by averaging the data of corresponding pixels in the M frames of to-be-referenced images;
determining a motion region and a non-motion region according to the reference image.
In this embodiment, N is greater than or equal to 2; for example, N may be 2, 3, 4 and so on. The specific value of N may be flexibly set by the user and/or the terminal developer according to actual needs, for example set to 5, 6, 7 and so on. The specific value of N may also be determined according to a preset time and the camera exposure time, where N = preset time / exposure time. It should be noted that the preset time may be flexibly set by the user and/or the terminal developer according to actual needs.
In this embodiment, when the N frames of images are shot for the same photographed scene, the N frames may be shot for the same photographed scene by one camera. When two or more cameras are provided on the same face of the terminal, the N frames may also be shot for the same photographed scene by these two or more cameras.
In this embodiment, M is greater than or equal to 2. It should be understood that, since the M frames are selected from the N frames, M should be less than or equal to N. For example, M may be equal to N, that is, all the shot images are used as the to-be-referenced images. M may also be N-1, that is, M = N-1. Of course, M may also take other values, which may be flexibly set according to actual needs.
In this embodiment, determining the motion region and the non-motion region according to the reference image may include: selecting one frame of image from the N frames of images as a base image, and comparing the base image with the reference image to determine the motion region and the non-motion region. A region whose difference is greater than a preset threshold is taken as the motion region, and a region whose difference is less than or equal to the preset threshold is taken as the non-motion region; the preset threshold may be flexibly set according to actual needs.
In this embodiment, before the N frames of images are shot for the same photographed scene, the processor 1101 is further configured to execute the one or more programs stored in the memory 1102 to implement the following step: determining whether the ambient brightness is less than a preset brightness threshold; if so, shooting the N frames of images for the same photographed scene; if not, ending.
In this embodiment, before the N frames of images are shot for the same photographed scene, the processor 1101 is further configured to execute the one or more programs stored in the memory 1102 to implement the following step: determining whether the terminal is currently in a night scene shooting mode; if so, shooting the N frames of images for the same photographed scene; if not, ending.
In this embodiment, after the motion region and the non-motion region are determined according to the reference image, the processor 1101 is further configured to execute the one or more programs stored in the memory 1102 to implement the following step: performing multi-frame noise reduction processing on the N frames of images based on the determined motion region and non-motion region to obtain a target image, that is, the image finally obtained by shooting.
It is worth noting that, in order to avoid a lengthy description, not all the examples in the first embodiment, the second embodiment and the third embodiment are fully elaborated in this embodiment; it should be understood that all the examples in the first embodiment, the second embodiment and the third embodiment are applicable to this embodiment.
This embodiment also provides a computer-readable storage medium, such as a floppy disk, an optical disc, a hard disk, a flash memory, a USB flash drive, a CF card, an SD card, an MMC card and the like, in which one or more programs for implementing the above steps are stored. The one or more programs can be executed by one or more processors to implement the steps of the image region determination method described in any of the above first, second and third embodiments.
With the terminal and the computer-readable storage medium provided by this embodiment, after the photographing instruction is received, N frames of images are shot for the same photographed scene, where N is greater than or equal to 2; M frames of images are then selected from the N frames of images as the to-be-referenced images, where M is greater than or equal to 2; the reference image is obtained by averaging the data of corresponding pixels in the M frames of to-be-referenced images; and the motion region and the non-motion region are then determined according to the reference image. This improves the accuracy of the determination of the motion region and the non-motion region, reduces the probability of ghosting during subsequent multi-frame noise reduction based on the motion region and the non-motion region, and improves the user's experience satisfaction.
It should be noted that, in this document, the terms "include", "comprise" or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article or apparatus that includes a series of elements not only includes those elements, but also includes other elements not explicitly listed, or further includes elements inherent to such process, method, article or apparatus. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article or apparatus that includes the element.
The serial numbers of the above embodiments of the present invention are for description only and do not represent the superiority or inferiority of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by means of software plus a necessary general hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk or an optical disc) and includes several instructions for enabling a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device or the like) to execute the methods described in the various embodiments of the present invention.
The embodiments of the present invention are described above with reference to the accompanying drawings, but the present invention is not limited to the above specific embodiments. The above specific embodiments are merely illustrative rather than restrictive. Under the inspiration of the present invention, those of ordinary skill in the art can make many other forms without departing from the purpose of the present invention and the scope protected by the claims, all of which fall within the protection of the present invention.