US7283666B2 - Digital image exposure correction

Info

Publication number: US7283666B2
Application number: US 10/375,440
Other versions: US20040170316A1
Authority: US (United States)
Inventors: Suhail S. Saquib, Jay E. Thornton
Original Assignee: Individual (assigned to Polaroid Corporation)
Current Assignee: Intellectual Ventures I LLC (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Prior art keywords: image, exposure, transformed, transformed image, transforming
Legal status: Expired - Lifetime (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Related applications: PCT/US2004/004964 (WO2004077816A2); EP04712899A (EP1597911A2); JP2005518583A (JP2006515136A); JP2008213280A (JP2009005395A); US 11/546,633 (US7826660B2); US 12/874,809 (US8265420B2)
Assignment history (condensed): assigned by the inventors to Polaroid Corporation; security interests granted to Wilmington Trust Company and to JPMorgan Chase Bank, N.A. and later released; assigned to Senshin Capital, LLC, which merged into Intellectual Ventures I LLC; additional security agreements involving ZINK Imaging, Inc. are also recorded.

Abstract

Techniques are disclosed for correcting the exposure of a digital image. An exposure predictor may be generated based on a set of images for which ground truth data are known. After identifying an optimal set of features, the exposure of the digital image may be corrected by extracting values of the selected optimal features from the image, using the predictor to predict a desired exposure correction for the image, and correcting the exposure of the image by the predicted desired amount. Exposure correction is based on a model that relates intensity of light in the world to the RGB digits of the digital image. The model comprises a gamma function that models the response of a typical monitor and an S-shaped curve that compresses the large dynamic range of the world to the small dynamic range of the RGB digit space.

Description

BACKGROUND
1. Field of the Invention
The present invention relates to digital image processing and, more particularly, to correcting the exposure of digital images.
2. Related Art
It is essential to properly expose a digital image to obtain a good quality rendition of the original scene on an output device such as a monitor or a printer. The “exposure” of a digital image refers to the quantity of light allowed to act on the image capture sensor; exposure is a product of the intensity (controlled by the aperture and intensity of the illuminant) and the duration (controlled by the shutter speed) of light striking the sensor. Large exposure values will result in brighter images and vice versa. Relying on the original exposure set by the input device (e.g., a digital camera) usually does not yield the best quality for several reasons. For example, a wide variety of picture-taking conditions and scene compositions may make the original exposure quite variable and differ from the preferred exposure. Furthermore, input devices typically have limited dynamic range and therefore err on the side of under-exposing an image to avoid losing information in an image due to clipping. Although underexposed images may appear darker than desired, they tend to retain more information than overexposed images and therefore are amenable to post-acquisition exposure correction to make them more suitable for printing or displaying on an output device.
It is desirable that output devices be equipped to produce properly-exposed renderings from images acquired using a variety of (possibly unknown) image acquisition devices. For example, a desktop digital photo printer or a photo-vending kiosk may be capable of receiving digital images acquired using any of a wide variety of digital cameras, scanners, or other input devices under a wide variety of conditions. It is desirable that such a printer or kiosk be capable of correcting the exposure of any images it receives so that such images may be printed with optimal exposures.
What is needed, therefore, are improved techniques for correcting the exposure of digital images.
SUMMARY
Techniques are disclosed for correcting the exposure of a digital image. An exposure predictor may be generated based on a set of images for which ground truth data are known. An optimal feature set may be identified that strikes a balance between minimizing prediction error and producing good results across a wide range of images. The exposure of an image may be corrected by extracting values of the selected optimal features from the image, using the predictor to predict a desired exposure correction for the image, and correcting the exposure of the image by the predicted desired amount. To facilitate the exposure correction, we propose a model that relates intensity of light in the world to the RGB digits of the digital image. This model comprises a gamma function that models the response of a typical monitor and an S-shaped curve that allows us to compress the large dynamic range of the world to the small dynamic range of the RGB digit space. The exposure of the image may then be corrected by employing the inverse of this model to transform the image to logarithmic intensities in the world, adding or subtracting an offset (given by the desired exposure correction) from the image, and then mapping the image back to the RGB digit space using the above model.
For example, in one aspect of the present invention, a method is provided for correcting the exposure of a source image. The method includes steps of: (A) transforming the source image from an image capture space into a nonlinear intensity space to produce a first transformed image; (B) correcting the exposure of the transformed image in the nonlinear intensity space to produce a corrected transformed image; and (C) transforming the corrected transformed image into the image capture space to produce a second transformed image. The step (C) may include steps of: (C)(1) transforming the corrected transformed image into a third transformed image using an S-shaped curve; and (C)(2) transforming the third transformed image into the second transformed image using a gamma function.
If i represents an intensity in the nonlinear intensity space, the step (C) may include a step of transforming the corrected transformed image into the second transformed image using the formula:
T(i) = (A + B·tanh(−s(i + o)))^(1/γ),
the step (A) may transform gray level g in the source image by applying the function T^−1(g) to the gray level to produce transformed intensities, and the step (B) may include a step of adding an exposure offset Δe to the transformed intensities to produce corrected transformed intensities.
In another aspect of the present invention, a method is provided for processing an image. The method includes steps of: (A) extracting from the image values of at least one feature selected from a set of features including: a thumbnail of the image, a luminance channel of the image, a region of interest in the image, and a subset of the image including a plurality of pixels satisfying an activity threshold; (B) predicting a desired exposure correction of the image based on the extracted feature values; and (C) correcting the exposure of the image by the predicted exposure correction to produce an exposure-corrected image. The set of features may include other features instead of or in addition to the features just listed.
The region of interest may have the following properties: (1) the average activity within the region is above a predetermined minimum activity threshold; and (2) the absolute logarithm of the ratio of the average luminance of the region to the average luminance of that portion of the image not including the region is the highest such absolute logarithm for a predetermined plurality of regions in the image. The region of interest may have a base size that is proportional to the dimensions of the image, and the dimensions of the region of interest may be proportional to the base size multiplied by a measure of average activity in the image.
In another aspect of the present invention, a method is provided for selecting a set of features for use in a system for adjusting the exposure of images. The method includes steps of: (A) placing a set of features in a master feature set M; (B) initializing a current feature set C to a null value; (C) for each feature F in the master set M, performing steps of: (1) placing the union of the current feature set C and the feature F in a temporary feature set S; (2) computing a leave-n-out error E for a plurality of images using set S as a feature set; (3) if the error E is less than a minimum error E_MIN, assigning the value of E to E_MIN and recording the identity of feature F in a variable F_MIN; (D) if E_MIN is less than a global error E_G, assigning the value of E_MIN to E_G, adding the feature F recorded in F_MIN to the set C, and deleting the feature F recorded in F_MIN from the set M; (E) if the set M is not empty, returning to step (C); and (F) if the set M is empty or the value of E_MIN is greater than the value of E_G, selecting the set C as the set of features for use in the system for adjusting the exposure of images.
Other features and advantages of various aspects and embodiments of the present invention will become apparent from the following description and from the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1A is a flowchart of a method for correcting the exposure of an image according to one embodiment of the present invention;
FIG. 1B is a flowchart of a method for reducing an image according to one embodiment of the present invention;
FIG. 1C is a flowchart of a method for extracting features from an image according to one embodiment of the present invention;
FIG. 2A is a dataflow diagram illustrating operations performed by the method shown in FIG. 1A;
FIG. 2B is a dataflow diagram illustrating operations performed by the method shown in FIG. 1B;
FIG. 2C is a dataflow diagram illustrating the operations performed by the method shown in FIG. 1C;
FIG. 3 is a flowchart of a method for identifying a region of interest in an image according to one embodiment of the present invention;
FIG. 4 is a flowchart of a method for extracting features from an image according to one embodiment of the present invention;
FIG. 5 is a flowchart of a method for generating a predictor for predicting desired image exposures according to one embodiment of the present invention;
FIG. 6 is a dataflow diagram illustrating the operations performed by the method shown in FIG. 5;
FIG. 7A is a flowchart of a method for generating ground truth data for a set of ground truth images according to one embodiment of the present invention;
FIG. 7B is a dataflow diagram illustrating the operations performed by the method of FIG. 7A according to one embodiment of the present invention;
FIG. 7C is a dataflow diagram illustrating the generation of ground truth data for an image in a ground truth set of images according to one embodiment of the present invention;
FIG. 7D is a dataflow diagram illustrating the computation of a prediction error for a test set image according to one embodiment of the present invention;
FIG. 7E is a dataflow diagram illustrating the computation of an average prediction error for a plurality of images in a ground truth image set according to one embodiment of the present invention;
FIG. 8 is a flowchart of a method for computing the average prediction error for a plurality of images in a ground truth image set according to one embodiment of the present invention;
FIG. 9 is a flowchart of a method for selecting an optimal number and combination of features for use in exposure correction according to one embodiment of the present invention;
FIG. 10 is a graph illustrating a family of exposure adjustment curves for use in exposure correction according to one embodiment of the present invention;
FIG. 11 is a flowchart of a method for applying an exposure correction to an image to produce an exposure-corrected image according to one embodiment of the present invention; and
FIG. 12 is a flowchart of a method in which color mapping and exposure correction are integrated according to one embodiment of the present invention.
DETAILED DESCRIPTION
In general, the exposure correction algorithm disclosed herein may be divided into two parts. The first part extracts, from the input image, the values of a set of features that contain the information that is most relevant to the exposure of the image. The second part finds a predictor that operates on the extracted features to generate a predicted exposure correction to apply to the image. The predicted exposure correction is applied to the image to produce an exposure-corrected image. The predictor may, for example, be a linear predictor that is chosen so that the error between the predicted exposure and desired exposure of images is minimized in a least square sense. Since the theory for generating the best linear predictor is well known in the statistical and signal processing arts, the disclosure herein will emphasize techniques both for identifying good features that correlate very well with the desired exposure and for determining an optimal feature set that yields the best linear predictor given the ground truth data.
Referring to FIG. 1A, a flowchart is shown of a method 100 for correcting the exposure of an image according to one embodiment of the present invention. Referring to FIG. 2A, a dataflow diagram 200 is shown which illustrates the operations performed by the method 100 shown in FIG. 1A.
The method 100 operates on an input image 202 (FIG. 2A), which may be received from any of a variety of sources such as a digital camera or scanner. The input image 202 may be represented in any of a variety of formats for representing digital images, such as the JPEG format.
The method 100 extracts from the image 202 the values 208 of a set of selected features 218 (step 106). The selected features 218 may, for example, be identifiers or other descriptors which identify the particular features to be extracted in step 106. Examples of features that may be extracted in step 106, and examples of techniques for extracting them, will be described below with respect to FIG. 1B, FIG. 1C, FIG. 2C, FIG. 3, and FIG. 4. Techniques that may be used to select the set of features 218 will be described below with respect to FIG. 9.
The method 100 generates a predictor 216 based on ground truth data 210 for a set of ground truth images (step 108). Note that step 108 need not be performed each time the method 100 is performed. Rather, the predictor 216 may be generated once, prior to execution of the method 100. The generated predictor 216 may then be used each time the method 100 is performed, without the need to perform step 108. Techniques that may be used to generate the predictor 216 will be described below with respect to FIGS. 5-9. The method 100 uses the predictor 216 to generate a predicted exposure offset 212 based on the extracted feature values 208 (step 110).
The method 100 corrects the exposure of the input image 202 by shifting the exposure of the input image 202 by the predicted exposure offset 212, thereby producing an exposure-corrected image 214 (step 112). Techniques that may be used to perform the exposure correction will be described below with respect to FIG. 11.
Various techniques may optionally be applied to improve the contrast of the image 202. For example, the range of intensities in the image 202 may be stretched to cover the range of available intensities (e.g., [0,255]) as follows. The red channel R of the input image 202 may be linearized, thereby producing a linearized red channel R_L, using the formula R_L = (R/255)^γ. Linearized green and blue channels G_L and B_L may be produced similarly. Applying the gamma function to the RGB space image 202 transforms it to a linear intensity space that more closely reflects the original dynamic range of the image 202. The term "RGB space" as used herein refers to any space in which an image may be captured (referred to herein as an "image capture space").
A minimum intensity value mn and a maximum intensity value mx for the image 202 may be obtained using the following formulas: mn = min(R_L, G_L, B_L) and mx = max(R_L, G_L, B_L). The linearized red channel R_L may be stretched to produce a stretched linearized red channel R_L′ using the formula R_L′ = (R_L − mn)/(mx − mn). Stretched linearized green and blue channels G_L′ and B_L′ may be produced similarly. Subsequent operations (such as feature extraction) described below may be performed on the channels R_L′, G_L′, and B_L′.
The channel R_L′ may be transformed back into the channel R′ using the formula R′ = 255(R_L′)^(1/γ). The channels G′ and B′ may be obtained from the channels G_L′ and B_L′ in a similar manner. In the following description, operations that are described as being performed on channels R_L, G_L, and B_L or R, G, and B may alternatively be performed on channels R_L′, G_L′, and B_L′ or R′, G′, and B′.
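The following sketch illustrates this linearize-stretch-relinearize sequence in Python with NumPy. The single global min/max over all three channels follows the description above; the specific gamma value is an assumption, since the text does not fix one.

```python
import numpy as np

GAMMA = 2.2  # assumed monitor gamma; the text leaves the value unspecified

def stretch_contrast(rgb, gamma=GAMMA):
    """Stretch image intensities to the full range, as described above.

    rgb: uint8 array of shape (H, W, 3). Returns a uint8 array, same shape.
    """
    # Linearize each channel: R_L = (R / 255)^gamma
    linear = (rgb.astype(np.float64) / 255.0) ** gamma

    # Global min/max over all three linearized channels
    mn, mx = linear.min(), linear.max()
    if mx <= mn:  # flat image; nothing to stretch
        return rgb.copy()

    # Stretch: R_L' = (R_L - mn) / (mx - mn)
    stretched = (linear - mn) / (mx - mn)

    # Back to gamma-encoded digits: R' = 255 * (R_L')^(1/gamma)
    return np.clip(255.0 * stretched ** (1.0 / gamma), 0, 255).astype(np.uint8)
```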
The present invention may be used in conjunction with any individual features and with any combination of features. Examples of techniques that may be used to select an optimal set of features for extraction, from among an initial set of features, will be described below with respect to FIG. 9. Once a particular set of features is selected using such techniques, values 208 of the selected features 218 may be extracted from the image 202 (step 106). Particular examples of features that may be used in conjunction with embodiments of the present invention, and techniques for extracting values of such features, will now be described.
In one embodiment of the present invention, the size of the input image 202 is reduced as a part of feature extraction. The input image 202 may be reduced because, in general, any particular image may carry a significant amount of information that is irrelevant to its desired exposure. For example, the chrominance channels of a color image are independent of the exposure of the image and therefore do not yield any information regarding the desired exposure. Such extraneous information may be discarded both to reduce the computational complexity of the exposure correction techniques described herein and to accurately estimate the coefficients 526 (FIG. 6) of the predictor 216, as described in more detail below. Failure to exclude such extraneous information may decrease the accuracy of coefficient estimation because, in practice, there is only a limited training image set for which the desired exposure (ground truth) is known. This tends to bias the estimation of the predictor coefficients 526 when the number of images in the training set 716 (FIG. 7D) cannot adequately support the number of selected features in the feature set 208. Determining the optimal number of features is an important and difficult problem, and will be described in more detail below with respect to FIG. 9.
Referring to FIG. 1B, a flowchart is shown of a method that may be used to reduce the image 202 according to one embodiment of the present invention. Referring to FIG. 2B, a dataflow diagram 220 is shown which illustrates the operations performed by the method 102 shown in FIG. 1B. The method 102 generates a thumbnail 222 of the input image 202 (step 122). In general, a thumbnail of an image is a reduced-dimension version of the image that is produced by some form of downsampling. Techniques for producing thumbnails are well-known to those of ordinary skill in the art.
In one embodiment of the present invention, it is assumed that most of the exposure information in the input image 202 is contained in the luminance channel of the image 202. The method 102 therefore extracts a linear luminance channel 224 (step 124) and a non-linear luminance channel 226 (step 126) from the thumbnail 222 so that subsequent processing is performed only on the linear luminance channel 224 and the non-linear luminance channel 226. If, for example, the R, G, and B channels of the input image 202 have been linearized into channels R_L, G_L, and B_L, respectively, the linear luminance channel 224 and non-linear luminance channel 226 may be produced as follows. The linearized luminance channel 224, represented by L_L, may be produced using the formula L_L = aR_L + bG_L + cB_L, where a, b, and c are constants. The non-linear luminance channel 226, represented by L, may be produced from the linearized luminance channel 224 L_L by the following formula:
L = 255(L_L)^(1/γ).
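As a rough Python sketch of this reduction step: the text only states that a, b, and c are constants, so the Rec. 601 luma weights used here are illustrative, and the gamma value is again an assumption.

```python
import numpy as np

A_W, B_W, C_W = 0.299, 0.587, 0.114  # illustrative luma weights for a, b, c
GAMMA = 2.2                          # assumed

def luminance_channels(thumb_rgb, gamma=GAMMA):
    """Return (linear luminance L_L, non-linear luminance L) of a thumbnail."""
    linear = (thumb_rgb.astype(np.float64) / 255.0) ** gamma
    # L_L = a*R_L + b*G_L + c*B_L
    ll = A_W * linear[..., 0] + B_W * linear[..., 1] + C_W * linear[..., 2]
    # L = 255 * (L_L)^(1/gamma)
    l = 255.0 * ll ** (1.0 / gamma)
    return ll, l
```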
Referring to FIG. 1C, a flowchart is shown of a method that may be used to extract additional features from the input image 202 according to one embodiment of the present invention. Referring to FIG. 2C, a dataflow diagram 240 is shown which illustrates the operations performed by the method 140 shown in FIG. 1C. The input image 202 is reduced to produce the linear luminance channel 224 and the non-linear luminance channel 226 using the techniques described above with respect to FIGS. 1B and 2B (step 102).
A subset of the non-linear luminance channel 226 may be isolated to improve the performance of the predictor 216. A subset of the non-linear luminance channel 226, for example, rather than the entire non-linear luminance channel 226, may be used for exposure correction based on the observation that, in the typical case, all parts of an image do not equally influence our subjective judgment of the image's preferred exposure. For example, the desired exposure of an image with a subject standing in front of a flat, textureless wall will most likely not be determined by the luminance values of the wall. The performance of the predictor 216 may therefore be improved by not using the wall pixels to compute the histogram described below. Although the factors that influence exposure are highly subjective, we have found that objects with little or no "activity" typically do not influence exposure. The non-linear luminance channel 226 may therefore first be screened for activity to produce an active image map 206 which identifies the locations of pixels in the non-linear luminance channel 226 that satisfy the activity threshold (step 104).
An active non-linear luminance image 209 may be extracted 207 from the non-linear luminance channel 226 based on the active image map 206. The active image 209 may also be referred to as the "active luminance channel" in embodiments which operate only upon the non-linear luminance channel 226 of the input image 202. Techniques that may be used to measure activity are described in more detail in commonly-owned U.S. Pat. No. 5,724,456 to Boyack et al., entitled "Brightness Adjustment of Images Using Digital Scene Analysis," issued on Mar. 3, 1998 and incorporated by reference herein.
The method 140 generates a histogram 244 (referred to as the "active histogram," or as the "active luminance histogram" (ALH) in embodiments in which only the luminance channel is being used) based on the active image 209 (step 130). The active histogram 244 is one example of a feature whose value may be extracted in step 106.
Using the entire active histogram 244 as a feature, however, may be inadequate to deal with images that are shot outdoors with the subject against a brightly lit background or with images that are shot indoors using the camera flash. Such images are distinguished by the fact that the exposures of the subject and background differ significantly from each other. If the subject occupies a small fraction of the image and there is sufficient activity in the background, an exposure predictor generated using the entire active histogram 244 will favor the background, thereby underexposing the subject in the outdoor case and overexposing the subject in the indoor case.
To address this problem, we introduce the concept of a region of interest (ROI): an area of the reduced image 204 that is most likely to determine the exposure of the original input image 202. The method 140 may identify a region of interest 242 using the linear luminance channel 224 and the active image map 206 (step 132). The method 140 may then generate a histogram 246 of the portion of the active image 209 that is within the identified region of interest 242 (step 134). Both the active histogram 244 and the ROI histogram 246 are examples of features that may be extracted in step 106.
The method 140 may also generate an average histogram 248 by taking the weighted average of the ROI histogram 246 and the active histogram 244 (step 136). The average histogram is another example of a feature that may be extracted in step 106. Techniques for generating the average histogram 248 will be described below with respect to FIG. 4.
Examples of techniques will now be described for using a variable-size rectangular window to search for and identify the ROI 242 of the thumbnail image 222 (step 132). In one embodiment of the present invention, the ROI 242 is defined as a rectangular window that satisfies the following conditions:
  • (1) the average activity within the ROI 242 (defined as the ratio of the number of active pixels in the ROI to the total number of pixels in the ROI) is above a specified minimum activity threshold A_MIN; and
  • (2) the absolute logarithm of the ratio of the average linear luminance within the ROI to the average linear luminance of the remaining image (i.e., the portion of the linearized luminance channel 224 not including the ROI 242) is the highest in the thumbnail image 222.
In this embodiment, the average luminance is computed over all the pixels in the linearized luminance channel 224 and not just the pixels in the active image map 206.
Condition (1) ensures that the ROI 242 encompasses some interesting content of the thumbnail image 222. Condition (2) serves to identify the subject in in-door flash scenes and out-door backlit scenes. For scenes in which there are no significant differences in luminance between any one portion of the linearized luminance channel 224 as compared to the rest of the linearized luminance channel 224, the ROI 242 encompasses an arbitrary region of the luminance channel 226 satisfying both conditions (1) and (2). However, in this case the ROI 242 will not have any significant contribution to the exposure of the final exposure-corrected image 214, a property that will become clear from the description below.
Referring to FIG. 3, a flowchart is shown of techniques that may be used to identify the region of interest 242 (step 132). First, the dimensions D_R (e.g., width and height) of the region of interest 242 are selected. The aspect ratio of the region of interest 242 may be the same as the aspect ratio of the thumbnail 222, and the base dimensions of the region of interest 242 may be equal to a fixed fraction of the dimensions of the thumbnail 222. For example, if D_I represents the dimensions of the thumbnail 222 (step 302) and F is a predetermined fractional multiplier (step 304), the base size B_R of the region of interest 242 may be set equal to F*D_I (step 306).
The actual dimensions D_R of the region of interest 242 scale linearly from the base dimensions B_R with the average activity of the non-linear luminance channel 226. In other words, if A_I is the average activity of the entire active image map 206 (step 308), then the dimensions D_R of the region of interest 242 are equal to B_R*A_I (step 310). Intuitively, an image with sparse activity will tend to have a small region of interest and vice versa. The scaling property represented by step 310 also helps the region of interest 242 to pass condition (1).
Having selected the dimensions D_R of the region of interest 242, a region that satisfies both conditions (1) and (2), stated above, may be selected as the region of interest 242. Referring again to FIG. 3, the region of interest 242 may be selected as follows.
A variable LOG_MAX is initialized to zero and a variable r_ROI is initialized to one (step 312). The meaning of the values of LOG_MAX and r_ROI will become clear below. The method 132 sets the value of a variable ROI_found to FALSE (step 313). As its name implies, ROI_found indicates whether the method 132 has yet found a region of interest.
The method 132 enters a loop over each candidate region C in the thumbnail 222 (step 314). The average activity A_C of the region C is calculated (step 316). The method 132 determines whether A_C ≥ A_MIN (step 318), thereby determining whether condition (1) is satisfied. If A_C < A_MIN, the method 132 continues to the next region (step 334).
If A_C ≥ A_MIN, the method 132 calculates the average luminance L_C of region C of the linear luminance channel 224 (step 322) and the average linear luminance L_I of the remainder of the linear luminance channel 224 (i.e., of the portion of the linear luminance channel 224 not including region C) (step 324). The ratio of L_C to L_I is assigned to the variable r_ROI, and the absolute logarithm of r_ROI (i.e., |log2 L_C/L_I|) is calculated and assigned to a variable named LOG_CUR (step 326).
The method 132 determines whether the value of LOG_CUR is greater than the value of LOG_MAX (step 328). In other words, the method 132 determines whether the absolute log of the ratio of the average luminance in region C to the average luminance of the remainder of the linear luminance channel 224 is the highest encountered so far in the linear luminance channel 224. If LOG_CUR is greater than LOG_MAX, a variable ROI is assigned the value of C (step 330) and the variable LOG_MAX is assigned the value of LOG_CUR (step 332). Because a region of interest has been found, the value of ROI_found is set to TRUE (step 333). Steps 316-333 are repeated for the remaining regions C in the thumbnail 222 (step 334).
Upon completion of the loop, the method 132 determines whether the value of ROI_found is equal to TRUE (step 336). If it is, the method 132 terminates. Otherwise, a region of interest has not been found, and the method 132 sets the variable ROI to encompass the entire input image 202 (step 338).
Upon completion of the method 132, the value of the variable ROI identifies a region in the image 202 that satisfies both conditions (1) and (2), if such a region exists, and that may therefore be used as the region of interest 242.
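A compact Python rendering of this search is sketched below. The activity threshold A_MIN and fractional multiplier F are illustrative values, not ones given in the text, and the window is stepped over every position for clarity rather than speed.

```python
import numpy as np

A_MIN = 0.2  # illustrative minimum average-activity threshold
F = 0.35     # illustrative fractional multiplier for the base ROI size

def find_roi(linear_lum, active_map, a_min=A_MIN, f=F):
    """Exhaustive search for the ROI window described above.

    linear_lum: 2-D array of linear luminance values (the thumbnail).
    active_map: 2-D boolean array marking active pixels.
    Returns (top, left, height, width) of the selected ROI.
    """
    h, w = linear_lum.shape
    a_img = active_map.mean()                  # average activity A_I
    rh = max(1, int(round(h * f * a_img)))     # D_R = B_R * A_I
    rw = max(1, int(round(w * f * a_img)))

    total, n, eps = linear_lum.sum(), linear_lum.size, 1e-12
    log_max, best = 0.0, None

    for top in range(h - rh + 1):
        for left in range(w - rw + 1):
            win = (slice(top, top + rh), slice(left, left + rw))
            if active_map[win].mean() < a_min:         # condition (1)
                continue
            region = linear_lum[win]
            l_c = region.mean()                        # average ROI luminance
            l_i = (total - region.sum()) / max(1, n - region.size)
            log_cur = abs(np.log2((l_c + eps) / (l_i + eps)))
            if log_cur > log_max:                      # condition (2)
                log_max, best = log_cur, (top, left, rh, rw)

    # Fall back to the whole image if no window passes condition (1)
    return best if best is not None else (0, 0, h, w)
```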
Referring to FIG. 4, a flowchart is shown of a method that may be used to extract additional features from the active ROI histogram 246 and the active histogram 244. By way of background, associated with the region of interest 242 may be a likelihood number that denotes the probability that the region of interest 242 influences the desired exposure of the image 202. Let r_ROI be the ratio of the average ROI linear luminance to the average linear luminance of the remaining image (i.e., L_C/L_I, where C = ROI). Then the likelihood associated with the region of interest 242 is given by Equation 1 (FIG. 4, step 402):
p(ROI) = 1/(1 + exp(−s(|log2 r_ROI| − o))),   Equation 1
where s and o are adjustable parameters. The parameter o represents the luminance difference in stops between the region of interest 242 and the remaining image when the likelihood associated with the region of interest 242 is 0.5. The parameter s is proportional to the slope of the likelihood function at |log2 r_ROI| = o. Since p(ROI) → 1.0 as |log2 r_ROI| → ∞, it follows that a large difference between the average linear luminance of the region of interest 242 and the rest of the linear luminance channel 224 implies a higher likelihood of the region of interest 242 influencing the final exposure of the image 202 and vice versa. Techniques will now be described for determining how the ROI likelihood p(ROI) weights the ROI contribution towards the final vector of feature values 208.
Let H_ROI(•) denote the active ROI luminance histogram 246 (FIG. 4, step 404). Let H_I(•) denote the active luminance histogram 244 (step 406). Then the overall average histogram H(•) 248 is given by Equation 2 (step 408):
H(•) = (1 − p(ROI))H_I(•) + p(ROI)H_ROI(•)   Equation 2
The histogram H(•) 248 is an example of a feature that may be extracted in step 106. However, the dimensionality of the feature space may be further reduced by extracting several linear and non-linear features from the histogram H(•) 248 (steps 410 and 412). Such features are examples of features that may be extracted in step 106. Examples of non-linear features that may be extracted include the different percentiles of the histogram H(•) 248. Examples of linear features that may be extracted include the different moments of the histogram H(•) 248.
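In Python, Equations 1 and 2 and a few representative histogram features might look as follows; the slope and offset values s and o are placeholders, since the text treats them as adjustable parameters.

```python
import numpy as np

S, O = 4.0, 1.0  # illustrative values for the adjustable parameters s and o

def roi_likelihood(r_roi, s=S, o=O):
    """Equation 1: p(ROI) = 1 / (1 + exp(-s * (|log2 r_ROI| - o)))."""
    return 1.0 / (1.0 + np.exp(-s * (abs(np.log2(r_roi)) - o)))

def average_histogram(h_img, h_roi, r_roi):
    """Equation 2: likelihood-weighted average of the two histograms."""
    p = roi_likelihood(r_roi)
    return (1.0 - p) * h_img + p * h_roi

def histogram_features(h, percentiles=(10, 50, 90)):
    """Example percentile (non-linear) and moment (linear) features."""
    h = h / h.sum()                  # normalize to a probability mass
    levels = np.arange(h.size)
    cdf = np.cumsum(h)
    pct = [levels[np.searchsorted(cdf, q / 100.0)] for q in percentiles]
    mean = float((levels * h).sum())                  # first moment
    var = float(((levels - mean) ** 2 * h).sum())     # second central moment
    return np.array(pct + [mean, var], dtype=np.float64)
```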
As mentioned above, the predictor 216 may be generated (step 108) based on ground truth data 210 for the set of ground truth images 522. In one embodiment of the present invention, the predictor 216 is a linear predictor.
Let N be the total number of feature values 208 and let f_i denote the ith feature value in the feature values vector 208. Let x_i denote the coefficients 526 (FIG. 6) of the predictor 216. Then the exposure shift prediction Δê 212 may be generated in step 110 as shown in Equation 3:
Δê = Σ_{i=0..N−1} x_i f_i = x^T f,   Equation 3
where f is the feature value vector 208 and x is the coefficient vector 526.
Referring to FIG. 5, a flowchart is shown of a method 500 for generating the predictor 216 (step 108) according to one embodiment of the present invention. Referring to FIG. 6, a dataflow diagram 520 is shown illustrating the operations performed by the method 108 shown in FIG. 5. The method 108 obtains ground truth data 210 for all images in a set of ground truth images 522 (step 502). Examples of techniques for obtaining the ground truth data 210 will be described below with respect to FIG. 7B. The method 108 selects the optimal set of features 218 based on the ground truth data 210 (step 504). Examples of techniques for selecting the feature set 218 will be described below with respect to FIG. 9. The method 108 computes coefficients 526 for the predictor 216 based on the ground truth data 210 and the selected features 218 (step 506). Referring to FIG. 6, step 506 may be performed, for example, by extracting 532 a set of training features 534 from a set of training images 528 based on the selected features 218, and by generating the predictor coefficients (step 506) based on the ground truth data 210 and the training features 534. The training set 528 may be any set of images that is used to train the predictor 216 and is a subset of the ground truth set 522. The method 108 generates the predictor 216 based on the selected features 218 and the selected coefficients 526 (step 508).
The method 108 shown in FIG. 5 will now be described in more detail. Referring to FIG. 7A, a flowchart is shown of a method 700 that may be used to generate the ground truth data 210 (FIG. 5, step 502). Referring to FIG. 7B, a dataflow diagram 750 is shown which illustrates the operations performed by the method 700 shown in FIG. 7A.
The ground truth data 210 may be acquired by conducting a psychophysical scaling test in which human subjects are asked to determine the best exposure for each of the images 522a-d in the ground truth set 522. Although only four images 522a-d are shown in FIG. 7B, in practice there may be a much larger number of ground truth images.
The method 700 may enter a loop over each image I in the ground truth set 522 (step 702). Referring to FIG. 7C, a dataflow diagram 720 is shown illustrating, by way of example, the generation of ground truth data 736a for a single image 522a in the ground truth set 522. For each human subject S in a plurality of human subjects 722a-d (step 704), the method 700 receives an indication from the subject of the desired exposure for image I (step 706). For example, referring to FIG. 10, a graph 1000 is shown of a family of exposure adjustment curves, the particular characteristics of which are described in more detail below. Each such curve may be applied to an image to perform a particular exposure correction on the image.
In one embodiment of the present invention, each of the exposure adjustment curves shown in the graph 1000 is applied to the image I, and the resulting exposure-adjusted images are displayed to the subject S. Associated with each of the exposure adjustment curves is a single number Δe_i reflecting the particular exposure adjustment associated with the curve. The subject S selects a particular one of the exposure-adjusted images that the subject believes has the best exposure among all of the exposure-adjusted images. The exposure correction Δe_i associated with the exposure adjustment curve corresponding to the image selected by the subject S is provided as the desired exposure indication in step 706.
For example, as shown in FIG. 7C, subject 722a indicates desired exposure 724a by selecting a particular one of the exposure-adjusted images as having the best exposure. The method 700 similarly receives desired exposure indications from the remaining subjects (step 708). For example, as shown in FIG. 7C, subject 722b indicates desired exposure 724b, subject 722c indicates desired exposure 724c, and subject 722d indicates desired exposure 724d.
The method 700 averages all of the exposure indications received in the loop in steps 704-708 to produce a single exposure correction number Δe, referred to as the "ground truth data" for image I (step 710). For example, as shown in FIG. 7C, ground truth data 736a is produced for image 522a. The inverse of the variance of the desired exposures 724a-d indicated by the subjects 722a-d may be used to weight the mean-square error in the design of the predictor 216. This allows the influence of any image in the determination of the prediction weights to be reduced when the subjects differed significantly in their opinions regarding the best exposure of image I.
The method 700 generates ground truth data for the remaining images in the ground truth set 522 using the same techniques (step 712). For example, as shown in FIG. 7B, ground truth data 736b may be generated for ground truth set image 522b, ground truth data 736c may be generated for ground truth set image 522c, and ground truth data 736d may be generated for ground truth set image 522d.
Before describing how feature selection (step 504) may be performed, techniques will be described for generating the predictor coefficients 526 given a particular set of features. For example, let e denote the column vector containing the ground truth data 210 for all of the images 522a-d in the ground truth set 522. Let the feature vectors of each of the images 522a-d of the ground truth set 522 form the rows of a matrix F. Then the predictor coefficients x 526 may be generated (step 506) using Equation 4:
x = (F^T F)^−1 F^T e.   Equation 4
Alternatively, if W is a diagonal weight matrix computed using the inverse variance of the ground truth data 210, the coefficients x 526 may be generated (step 506) using Equation 5:
x = (F^T W F)^−1 F^T W e.   Equation 5
In either case, once the number and type of features are determined (FIG. 9), the coefficients x 526 may be generated using the closed form expressions in Equation 4 or Equation 5. The remaining problem is to determine which features to select as the selected features 218.
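A minimal NumPy sketch of Equations 3-5 follows, using a numerically stable least-squares solve rather than the explicit matrix inverse (the two are mathematically equivalent when F has full column rank):

```python
import numpy as np

def predictor_coefficients(F, e, w=None):
    """Equations 4 and 5: (weighted) least-squares predictor coefficients.

    F: (m, n) matrix, one feature vector per ground-truth image.
    e: (m,) vector of ground-truth exposure shifts.
    w: optional (m,) weights (inverse ground-truth variances), W = diag(w).
    """
    if w is None:
        # Equation 4: x = (F^T F)^-1 F^T e
        x, *_ = np.linalg.lstsq(F, e, rcond=None)
    else:
        # Equation 5: x = (F^T W F)^-1 F^T W e, via sqrt-weighted rows
        sw = np.sqrt(w)
        x, *_ = np.linalg.lstsq(F * sw[:, None], e * sw, rcond=None)
    return x

def predict_shift(x, f):
    """Equation 3: predicted exposure shift = x^T f."""
    return float(x @ f)
```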
Examples of techniques will now be described for determining which and how many features are optimal for inclusion in the set of selected features 218. In one embodiment of the present invention, the set of ground truth images 522 is divided into two subsets: a training set (such as training set 716 shown in FIG. 7D) and a test set (not shown). The training set is used to design a training predictor 726 (FIG. 7D). The test set is used to test the predictor 726 and to compute the prediction error.
In practice, the number of ground truth images in the ground truth set 522 is limited. Dividing the images in the ground truth set 522 into subsets further limits the number of design or test samples, thereby potentially reducing the quality of the predictor 216. Referring to FIG. 8, a flowchart is shown of a method 800 that may be used to address this problem.
In general, the method 800 uses a leave-n-out approach which cycles through the entire ground truth image set 522, n images at a time. In the particular method 800 shown in FIG. 8, n=1. In each cycle, only one image from the ground truth set 522 is chosen for the test set and the rest of the images from the ground truth set 522 are used to design the predictor 726. The prediction error is then computed on the single image in the test set. The entire procedure is repeated for the next image in the ground truth set 522 and so on. The advantage of this method is that all images but one are used to design the predictor 726 and all images are used to test the predictor 726. This minimizes the bias in the design and test error of the predictor 726. However, the downside of this procedure is that the design procedure has to be repeated as many times as the number of images in the ground truth set 522.
The procedure described generally above will now be described in more detail. Referring to FIG. 8, a loop is entered over each image I in the ground truth image set G 522 (step 802). The single image I is placed into the test set (step 804), and all of the ground truth images 522a-d except for image I are placed into the training set (step 806).
Referring to FIG. 7D, a dataflow diagram 760 is shown which illustrates the calculation of a prediction error 766a by the method 800 based on the test set image I 762 and the current training set 716. The method 800 generates training predictor coefficients 744 based on the training set (step 808) using, for example, Equation 4 or Equation 5. In particular, the method 800 may extract 718 features 728 from the training set 716 based on a current set of features 734. The method 800 may generate training coefficients 744 based on ground truth data 770 for the training set 716 and the extracted training set features 728.
The method 800 generates a training predictor 726 based on the current set of feature identifiers 734 and the training predictor coefficients 744 (based on the structure of Equation 3) (step 810). The current set of features 734 may be selected as described below with respect to FIG. 9.
The method 800 extracts the current selected features 734 from the test set image I 762 to produce test set image features 746 (step 812). The method 800 uses the training predictor 726 to generate a predicted exposure shift 768 for the test set image 762 based on the extracted features 746 using Equation 3 (step 814). The method 800 calculates a prediction error E_I 766a for the test set image 762 by subtracting the predicted exposure shift 768 from the ground truth data 764 for the test set image I 762 (step 816). Prediction errors are generated in the same manner for the remaining images in the ground truth set G 522 (step 818).
Referring to FIG. 7E, for example, a dataflow diagram 754 is shown illustrating the generation of a plurality of prediction errors 766, one for each of the images 522a-d in the ground truth set 522. As each of the ground truth images 522a-d is used as the test set image 762, a corresponding prediction error is generated. For example, prediction error 766a is generated for image 522a, prediction error 766b is generated for image 522b, prediction error 766c is generated for image 522c, and prediction error 766d is generated for image 522d. Once prediction errors E_I 766a-d are generated for each image in the ground truth set 522, the root mean square (RMS) is taken of all of the prediction errors E_I 766a-d to produce an average prediction error E 758 for the ground truth set 522 (step 820). The average prediction error E 758 may be used to select an optimal number and combination of features for use as the selected features 218, as will now be described in more detail.
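For a fixed candidate feature set, the leave-one-out RMS error might be computed as follows (a sketch; the unweighted Equation 4 form is used for brevity):

```python
import numpy as np

def leave_one_out_rms(F, e):
    """RMS leave-one-out prediction error over the ground truth set.

    F: (m, n) matrix of feature vectors; e: (m,) ground-truth shifts.
    """
    m = F.shape[0]
    errors = np.empty(m)
    for i in range(m):
        train = np.arange(m) != i                    # hold out image i
        x, *_ = np.linalg.lstsq(F[train], e[train], rcond=None)
        errors[i] = e[i] - float(F[i] @ x)           # error on held-out image
    return float(np.sqrt(np.mean(errors ** 2)))
```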
Referring to FIG. 9, a flowchart is shown of a method 900 that may be used to select an optimal number and combination of features for use as the selected features 218. All available features are placed into a master feature set M (step 902). The master set of features may be selected in any manner. Examples of features that may be placed into the master feature set M include the ROI histogram 246, the active histogram 244, the average histogram 248, and linear and non-linear features extracted from the features just listed. A current feature set C is initialized to a null set (step 904), a global error value E_G is initialized to infinity (step 906), and a minimum error value E_MIN is initialized to infinity (step 908).
A loop is entered over each feature F in the master set M (step 910). A set S is formed by adding the feature F to the current feature set C (step 912). The method 800 shown in FIG. 8 is used to compute an average leave-n-out error E for the images 522a-d in the ground truth image set 522 using set S as the set of current selected features 734 (step 800).
If the average error E is less than the minimum error E_MIN (step 914), the minimum error E_MIN is assigned the value of E (step 916) and a variable F_MIN is assigned the value of F (the current feature) (step 918). The loop initiated in step 910 is repeated for each of the features F in the master set M (step 920).
Upon completion of the loop performed in steps 910-920, E_MIN contains the minimum leave-n-out prediction error obtained in any iteration of the loop, and F_MIN indicates the feature that resulted in the minimum error E_MIN. The method 900 determines whether the minimum error E_MIN is greater than the global error E_G (step 922). If E_MIN > E_G, the current feature set C is provided as the set of selected features 218. If E_MIN ≤ E_G, then E_G is assigned the value of E_MIN (step 924), and the feature F_MIN is added to the current feature set C and removed from the master set M (step 926).
The method 900 determines whether the master set M is empty (step 928). If the master set M is not empty, the procedure described above with respect to steps 910-926 is performed again using the updated master set M and current set C. If the master set M is empty, the current feature set C is provided as the set of selected features 218.
In summary, the method 900 identifies the single best feature that results in the minimum average prediction error in the first iteration of the loop initiated in step 910. In the second iteration of the loop, the method 900 identifies the next best feature that in combination with the first feature achieves the minimum error. The method 900 continues in this fashion until the minimum average prediction error E_MIN eventually starts to increase with the addition of more features. At this point the method 900 terminates. The features that are in the current set C upon termination of the method 900 represent a set of optimal features that may be provided as the selected features 218.
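Put together, the greedy forward selection reads roughly as below, reusing leave_one_out_rms from the previous sketch. E_MIN is reset on each pass here, which matches the stopping behavior described above.

```python
def select_features(F_all, e):
    """Greedy forward selection over feature columns, as in FIG. 9.

    F_all: (m, n) matrix of all candidate feature values.
    e: (m,) ground-truth exposure shifts.
    Returns the list of selected column indices.
    """
    master = list(range(F_all.shape[1]))   # master feature set M
    current = []                           # current feature set C
    e_global = float("inf")                # global error E_G

    while master:
        e_min, f_min = float("inf"), None
        for f in master:
            s = current + [f]              # temporary feature set S
            err = leave_one_out_rms(F_all[:, s], e)
            if err < e_min:
                e_min, f_min = err, f
        if e_min > e_global:               # adding any feature makes it worse
            break
        e_global = e_min
        current.append(f_min)
        master.remove(f_min)

    return current
```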
As described above with respect to FIG. 5 and FIG. 6, once the selected features 218 are selected (FIG. 9), the predictor coefficients 526 may be generated based on the ground truth data 210 and feature identifiers 218 using, for example, Equation 4 or Equation 5 (step 506), and the predictor 216 may be generated based on the features 218 and the coefficients 526 using, for example, the structure of Equation 3 (step 508).
Note that if the entire ground truth set 522 were used to design and test the predictor 216, the minimum error E_MIN would always decrease upon the addition of a new feature to the current set C. In fact, if there are m images in the ground truth set 522, the minimum error E_MIN can be made exactly zero by choosing m independent features. This follows from the fact that the column space of F spans the ground truth vector e. In such a case, the predictor that is generated may not be optimal. Rather, the predictor that is generated may merely predict the m images in the ground truth set 522 perfectly, while the performance for other images may not be specified. In practice, a predictor designed in this fashion may perform poorly in the field because it may not generalize its prediction well enough for images outside the ground truth set. By testing with a set that is independent of the training set, we ensure that only those features that generalize well for other images are included in the final feature set 218 and features that just fit the noise are excluded.
Having described how to generate the predictor 216, techniques will now be disclosed for changing (correcting) the exposure of the image 202 to the desired value (step 112). An algorithm that causes an exposure change should not alter the color balance of the image in the process. This may be achieved by operating solely on the luminance channel of the image in a luminance/chrominance space. Alternatively, if the exposure correction algorithm operates in the RGB space, the same transformation should be applied to all of the channels so as not to alter the color balance of the image. Techniques using the latter approach will now be described because it is desirable to transform the image such that at least one of its channels occupies the entire gray scale, and it is particularly easy to do this in the RGB space.
Once the predictor 216 is generated, the predicted exposure offset 212 for the image 202 may be generated based on the extracted feature values 208 using Equation 3 (step 110). Referring to FIG. 11, a flowchart is shown of a method for applying the exposure offset 212 to the input image 202 to produce the exposure-corrected image 214 (step 112) according to one embodiment of the present invention. The method 112 transforms the input image 202 from RGB space back to intensities in the original scene (i.e., the world intensity space) (step 1102). The method 112 performs exposure correction on the transformed image (step 1104). The method 1100 transforms the exposure-corrected image back from world intensity space to RGB space to produce the exposure-corrected image 214 (step 1106).
In one embodiment of the present invention, the forward transformation from the world log intensity space to the RGB space (step 1106) is modeled by an S-shaped curve that serves to compress the tones in the highlight and the shadow regions. This is followed by a gamma function designed to model the inverse response of a typical monitor. The combination of the S-shaped tone reproduction curve and gamma forms a complete forward transformation represented herein as T(•).
Let i denote the world log intensity. Then T(i) is defined by Equation 6:
T(i) = (A + B·tanh(−s(i + o)))^(1/γ),   Equation 6
where A, B, s and o are parameters of the S-shaped tone reproduction curve and γ is the monitor gamma. It should be appreciated that the parameters s and o in Equation 6 are not the same as the parameters s and o in Equation 1.
The reverse transformation from RGB space to log world intensity space (step 1102) for a particular gray level g in RGB space may therefore be represented as T^−1(g). The exposure correction of gray level g by a desired exposure offset Δe (measured in stops) in world intensity space (steps 1102 and 1104) may therefore be represented by T^−1(g) + Δe. The complete exposure correction of an RGB-space gray level g, including reverse and forward transformations, performed by the exposure correction method 1100 illustrated in FIG. 11, may therefore be represented by Equation 7:
g′ = T(T^−1(g) + Δe),   Equation 7
where g′ is the exposure-corrected gray level in RGB space. The graph 1000 in FIG. 10 illustrates a family of curves, each of which corresponds to a different value of Δe.
Once the predicted exposure offset Δe 212 is generated, the result of Equation 7 may be calculated for all gray levels, and pairs of gray levels and corresponding corrected gray levels may be stored in a lookup table (LUT). Exposure correction may thereafter be performed on each channel of an image using the lookup table rather than by calculating the results of Equation 7 for each pixel or gray level in the image, thereby significantly increasing the speed with which exposure correction may be performed.
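A Python sketch of this LUT construction follows. The S-curve parameters A, B, s, o, the gamma, and the scaling of T to 8-bit digits are illustrative assumptions (the text does not publish values), and T^−1 is obtained here by numerically inverting a tabulation of T:

```python
import numpy as np

# Illustrative parameters; the text treats A, B, s, o and gamma as
# model parameters without giving specific values.
A_P, B_P, S_P, O_P, GAMMA = 0.5, 0.5, 1.0, 0.0, 2.2

def forward_T(i):
    """Equation 6, scaled to 8-bit digits here:
    T(i) = 255 * (A + B*tanh(-s*(i + o)))^(1/gamma)."""
    return 255.0 * (A_P + B_P * np.tanh(-S_P * (i + O_P))) ** (1.0 / GAMMA)

def inverse_T(g):
    """Numerical T^-1: tabulate T over a range of world log intensities
    and interpolate. T is monotonically decreasing, so reverse the table."""
    i = np.linspace(-8.0, 8.0, 4096)
    t = forward_T(i)
    return np.interp(g, t[::-1], i[::-1])

def exposure_lut(delta_e):
    """Equation 7 tabulated for every gray level: g' = T(T^-1(g) + delta_e)."""
    g = np.arange(256, dtype=np.float64)
    g_prime = forward_T(inverse_T(g) + delta_e)
    return np.clip(np.round(g_prime), 0, 255).astype(np.uint8)

def correct_exposure(rgb, delta_e):
    """Apply the same LUT to all channels so color balance is preserved."""
    return exposure_lut(delta_e)[rgb]
```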
One advantage of the techniques just described is that they perform exposure correction based on a model of the mapping from world intensity space to the intensity space (e.g., RGB space) of the captured image 202. As described above, the model includes a gamma function that models the response of a typical monitor and an S-shaped curve that compresses the large dynamic range of the world to the small dynamic range of the image capture (e.g., RGB) space. Using such a model enables the exposure of the image 202 to be corrected by employing the inverse of the model to transform the image to logarithmic intensities in the world, adding or subtracting an offset (given by the desired exposure correction) from the image, and then mapping the image back to the RGB digit space using the above model. One advantage of using such a model is that it enables exposure corrections to be applied in the world intensity space, where such corrections are more likely to have their intended effect across the full range of intensities, assuming that the model reasonably reflects the transfer function that was used to capture the image 202.
Embodiments of the present invention may be integrated with the color mapping process that is typically performed on digital images when output to a rendering device such as a printer. For example, referring toFIG. 12, a flowchart is shown of amethod1200 in which color mapping and exposure correction are integrated according to one embodiment of the present invention. Themethod1200 receives an image from a source such as a digital camera (step1202) and performs JPEG decompression on the image (step1204). Themethod1200 reduces the image using the techniques described above with respect toFIGS. 1B and 2B (step102). Themethod1200 then performs automatic color balancing and automatic exposure correction on the image using an integrated process. Color balancing, for example, is often performed in the RGB space using three one-dimensional lookup tables. Such lookup tables may be combined with the exposure correction lookup tables described above to generate three one-dimensional lookup tables that perform both color balancing and exposure correction with a single set of one-dimensional lookups.
For example, exposure correction estimation may be performed (step 1218) using the techniques disclosed herein to generate three one-dimensional exposure correction lookup tables (step 1219). Three one-dimensional color-balancing lookup tables may also be computed (step 1220) and combined with the exposure correction lookup tables generated in step 1219 (step 1222). The method 1200 may perform any of a variety of image processing steps on the decompressed image, such as rotating the image (step 1206) and sharpening the image (step 1208). These particular image processing steps are shown merely for purposes of example and do not constitute limitations of the present invention.
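A sketch of that combination step, assuming the color-balance tables and the exposure-correction table are all 256-entry arrays; the application order shown (balance first, then exposure) is one plausible choice rather than an order mandated by this document:

    def compose_luts(first, second):
        # One table equivalent to applying `first`, then `second`:
        # combined[g] == second[first[g]] for every gray level g.
        return second[first]

    balance_luts = [np.arange(256, dtype=np.uint8)] * 3   # stand-in identity tables
    combined_luts = [compose_luts(bal, exposure_lut) for bal in balance_luts]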
The method 1200 performs color mapping on the image (step 1210). Color mapping often involves several operations, including a one-dimensional pre-lookup table, a three-dimensional matrix or three-dimensional lookup, and a one-dimensional post-lookup table. Exposure correction may be integrated into the one-dimensional pre-lookup table operation of color mapping using the single set of three one-dimensional lookup tables (generated in step 1222) that perform the combined function of exposure correction, color balance, and the one-dimensional pre-lookup table portion of color mapping.
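Continuing the sketch, the same composition folds the combined tables into the one-dimensional pre-lookup stage; pre_luts below is a stand-in for the renderer's actual pre-lookup tables:

    pre_luts = [np.arange(256, dtype=np.uint8)] * 3       # stand-in pre-lookup tables
    fused = [compose_luts(comb, pre) for comb, pre in zip(combined_luts, pre_luts)]
    # Each channel now needs a single 1-D lookup that performs exposure
    # correction, color balance, and the pre-lookup stage of color mapping.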
The method 1200 prepares the image for printing (or other output) by upsizing the image (step 1214). The method 1200 then prints the image (step 1216). It should be appreciated that various steps in the method 1200 (such as steps 1204, 1206, 1208, 1214, and 1216) are provided merely as examples of steps that may be performed in conjunction with processing of the input image 202 and do not constitute limitations of the present invention.
One advantage of the techniques disclosed herein is that they may operate in the RGB space, which makes them amenable to integration with color mapping as just described. Integrating exposure correction with color mapping reduces the number of steps required to optimize an image for printing, and may therefore be faster than methods that correct image exposure in a luminance-chrominance space or other non-linear space.
It is to be understood that although the invention has been described above in terms of particular embodiments, the foregoing embodiments are provided as illustrative only, and do not limit or define the scope of the invention. Various other embodiments, including but not limited to the following, are also within the scope of the claims.
Elements and components described herein may be further divided into additional components or joined together to form fewer components for performing the same functions.
The techniques described above may be implemented, for example, in hardware, software, firmware, or any combination thereof. The techniques described above may be implemented in one or more computer programs executing on a programmable computer including a processor, a storage medium readable by the processor (including, for example, volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. Program code may be applied to input entered using the input device to perform the functions described and to generate output. The output may be provided to one or more output devices.
Each computer program within the scope of the claims below may be implemented in any programming language, such as assembly language, machine language, a high-level procedural programming language, or an object-oriented programming language. The programming language may, for example, be a compiled or interpreted programming language.
Each such computer program may be implemented in a computer program product tangibly embodied in a machine-readable storage device for execution by a computer processor. Method steps of the invention may be performed by a computer processor executing a program tangibly embodied on a computer-readable medium to perform functions of the invention by operating on input and generating output. Suitable processors include, by way of example, both general and special purpose microprocessors. Generally, the processor receives instructions and data from a read-only memory and/or a random access memory. Storage devices suitable for tangibly embodying computer program instructions include, for example, all forms of non-volatile memory, such as semiconductor memory devices, including EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROMs. Any of the foregoing may be supplemented by, or incorporated in, specially-designed ASICs (application-specific integrated circuits). A computer can generally also receive programs and data from a storage medium such as an internal disk (not shown) or a removable disk. These elements will also be found in a conventional desktop or workstation computer as well as other computers suitable for executing computer programs implementing the methods described herein, which may be used in conjunction with any digital print engine or marking engine, display monitor, or other raster output device capable of producing color or gray scale pixels on paper, film, display screen, or other output medium.

Claims (11)

What is claimed is:
1. A method for correcting the exposure of a source image, the method comprising steps of:
(A) transforming the source image from an image capture space into a nonlinear intensity space to produce a first transformed image;
(B) correcting the exposure of the transformed image in the nonlinear intensity space to produce a corrected transformed image; and
(C) transforming the corrected transformed image back into the image capture space to produce a second transformed image by steps comprising:
(C)(1) transforming the corrected transformed image into a third transformed image using an S-shaped curve; and
(C)(2) transforming the third transformed image into the second transformed image using a gamma function;
wherein the step (C) comprises a step of transforming the corrected transformed image into the second transformed image using the formula:

T(i)=(A+B tanh(−s(i+o)))^(1/γ)
wherein i represents an intensity in the nonlinear intensity space, wherein A, B, s and o are parameters of said S-shaped curve and wherein γ is a gamma value of an output device for rendering the corrected transformed image.
2. The method ofclaim 1,
wherein the formula transforms intensities i in the nonlinear intensity space into gray levels in the image capture space and
wherein the step (A) comprises a step of transforming gray levels g in the source image by applying the function T−1(g) to the gray levels to produce transformed intensities.
3. The method ofclaim 2, wherein the step (B) comprises a step of adding an exposure offset Δe to the transformed intensities to produce shifted transformed intensities.
4. The method ofclaim 1, wherein the steps (A)-(C) are performed using at least one lookup table which maps intensities in the image capture space to transformed intensities in the image capture space according to steps (A)-(C).
5. The method ofclaim 4, wherein the at least one lookup table further performs color mapping on the source image.
6. An apparatus for correcting the exposure of a source image, the apparatus comprising:
first transformation means for transforming the source image from an image capture space into a nonlinear intensity space to produce a first transformed image;
correction means for correcting the exposure of the transformed image in the nonlinear intensity space to produce a corrected transformed image; and
second transformation means for transforming the corrected transformed image into the image capture space to produce a second transformed image, wherein the second transformation means comprises:
means for transforming the corrected transformed image into a third transformed image using an S-shaped curve; and
means for transforming the third transformed image into the second transformed image using a gamma function;
wherein the second transformation means comprises means for transforming the corrected transformed image into the second transformed image using the formula:

T(i)=(A+B tanh(−s(i+o)))^(1/γ)
wherein i represents an intensity in the nonlinear intensity space, wherein A, B, s and o are parameters of said S-shaped curve and wherein γ is a gamma value of an output device for rendering the corrected transformed image.
7. The apparatus ofclaim 6, wherein the formula transforms intensities i in the nonlinear intensity space into gray levels in the image capture space and
wherein the first transformation means comprises means for transforming gray levels g in the source image by applying the function T−1(g) to the gray levels to produce transformed intensities.
8. The apparatus ofclaim 7, wherein the correction means comprises means for adding an exposure offset Δe to the transformed intensities to produce shifted transformed intensities.
9. The apparatus ofclaim 6, wherein the first transformation means, the correction means, and the second transformation means are implemented in at least one lookup table which maps intensities in the image capture space to transformed intensities in the image capture space.
10. The apparatus ofclaim 9, wherein the at least one lookup table further performs color mapping on the source image.
11. A computer-readable medium having computer-executable instructions for correcting the exposure of a source image, the computer-executable instructions performing:
(A) transforming the source image from an image capture space into a nonlinear intensity space to produce a first transformed image;
(B) correcting the exposure of the transformed image in the nonlinear intensity space to produce a corrected transformed image; and
(C) transforming the corrected transformed image back into the image capture space to produce a second transformed image by steps comprising:
(C)(1) transforming the corrected transformed image into a third transformed image using an S-shaped curve; and
(C)(2) transforming the third transformed image into the second transformed image using a gamma function;
wherein the step (C) comprises a step of transforming the corrected transformed image into the second transformed image using the formula:

T(i)=(A+B tanh(−s(i+o)))^(1/γ)
wherein i represents an intensity in the nonlinear intensity space, wherein A, B, s and o are parameters of said S-shaped curve and wherein γ is a gamma value of an output device for rendering the corrected transformed image.
US10/375,4402003-02-272003-02-27Digital image exposure correctionExpired - LifetimeUS7283666B2 (en)

Priority Applications (7)

Application Number | Priority Date | Filing Date | Title
US10/375,440US7283666B2 (en)2003-02-272003-02-27Digital image exposure correction
PCT/US2004/004964WO2004077816A2 (en)2003-02-272004-02-19Digital image exposure correction
EP04712899AEP1597911A2 (en)2003-02-272004-02-19Digital image exposure correction
JP2005518583AJP2006515136A (en)2003-02-272004-02-19 Digital image exposure compensation
US11/546,633US7826660B2 (en)2003-02-272006-10-12Digital image exposure correction
JP2008213280AJP2009005395A (en)2003-02-272008-08-21Correction of digital image exposure
US12/874,809US8265420B2 (en)2003-02-272010-09-02Digital image exposure correction

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
US10/375,440US7283666B2 (en)2003-02-272003-02-27Digital image exposure correction

Related Child Applications (1)

Application Number | Title | Priority Date | Filing Date
US11/546,633DivisionUS7826660B2 (en)2003-02-272006-10-12Digital image exposure correction

Publications (2)

Publication Number | Publication Date
US20040170316A1 US20040170316A1 (en)2004-09-02
US7283666B2 (en)2007-10-16

Family

ID=32907818

Family Applications (3)

Application Number | Title | Priority Date | Filing Date
US10/375,440Expired - LifetimeUS7283666B2 (en)2003-02-272003-02-27Digital image exposure correction
US11/546,633Expired - LifetimeUS7826660B2 (en)2003-02-272006-10-12Digital image exposure correction
US12/874,809Expired - Fee RelatedUS8265420B2 (en)2003-02-272010-09-02Digital image exposure correction

Family Applications After (2)

Application Number | Title | Priority Date | Filing Date
US11/546,633Expired - LifetimeUS7826660B2 (en)2003-02-272006-10-12Digital image exposure correction
US12/874,809Expired - Fee RelatedUS8265420B2 (en)2003-02-272010-09-02Digital image exposure correction

Country Status (4)

Country | Link
US (3)US7283666B2 (en)
EP (1)EP1597911A2 (en)
JP (2)JP2006515136A (en)
WO (1)WO2004077816A2 (en)

Families Citing this family (59)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CA2448327A1 (en)*2001-05-302002-12-05Polaroid CorporationA high speed photo-printing apparatus
US6842186B2 (en)*2001-05-302005-01-11Polaroid CorporationHigh speed photo-printing apparatus
US6950211B2 (en)*2001-07-052005-09-27Corel CorporationFine moire correction in images
US20040181757A1 (en)*2003-03-122004-09-16Brady Deborah A.Convenient accuracy analysis of content analysis engine
JP4341295B2 (en)*2003-05-162009-10-07セイコーエプソン株式会社 Judging backlit human images
US7375854B2 (en)*2004-03-122008-05-20Vastview Technology, Inc.Method for color correction
US8004511B2 (en)2004-12-022011-08-23Sharp Laboratories Of America, Inc.Systems and methods for distortion-related source light management
US7982707B2 (en)2004-12-022011-07-19Sharp Laboratories Of America, Inc.Methods and systems for generating and applying image tone scale adjustments
US7782405B2 (en)2004-12-022010-08-24Sharp Laboratories Of America, Inc.Systems and methods for selecting a display source light illumination level
US7961199B2 (en)*2004-12-022011-06-14Sharp Laboratories Of America, Inc.Methods and systems for image-specific tone scale adjustment and light-source control
US8111265B2 (en)2004-12-022012-02-07Sharp Laboratories Of America, Inc.Systems and methods for brightness preservation using a smoothed gain image
US8913089B2 (en)2005-06-152014-12-16Sharp Laboratories Of America, Inc.Methods and systems for enhancing display characteristics with frequency-specific gain
US8922594B2 (en)2005-06-152014-12-30Sharp Laboratories Of America, Inc.Methods and systems for enhancing display characteristics with high frequency contrast enhancement
US7924261B2 (en)2004-12-022011-04-12Sharp Laboratories Of America, Inc.Methods and systems for determining a display light source adjustment
US7768496B2 (en)2004-12-022010-08-03Sharp Laboratories Of America, Inc.Methods and systems for image tonescale adjustment to compensate for a reduced source light power level
US7800577B2 (en)2004-12-022010-09-21Sharp Laboratories Of America, Inc.Methods and systems for enhancing display characteristics
US8120570B2 (en)2004-12-022012-02-21Sharp Laboratories Of America, Inc.Systems and methods for tone curve generation, selection and application
US8947465B2 (en)2004-12-022015-02-03Sharp Laboratories Of America, Inc.Methods and systems for display-mode-dependent brightness preservation
US7515160B2 (en)*2006-07-282009-04-07Sharp Laboratories Of America, Inc.Systems and methods for color preservation with image tone scale corrections
US9083969B2 (en)*2005-08-122015-07-14Sharp Laboratories Of America, Inc.Methods and systems for independent view adjustment in multiple-view displays
US7839406B2 (en)2006-03-082010-11-23Sharp Laboratories Of America, Inc.Methods and systems for enhancing display characteristics with ambient illumination input
JP5196731B2 (en)*2006-04-202013-05-15キヤノン株式会社 Image processing apparatus and image processing method
US8160364B2 (en)*2007-02-162012-04-17Raytheon CompanySystem and method for image registration based on variable region of interest
US7826681B2 (en)2007-02-282010-11-02Sharp Laboratories Of America, Inc.Methods and systems for surround-specific display modeling
KR101341095B1 (en)*2007-08-232013-12-13삼성전기주식회사Apparatus and method for capturing images having optimized quality under night scene conditions
US8155434B2 (en)2007-10-302012-04-10Sharp Laboratories Of America, Inc.Methods and systems for image enhancement
US8345038B2 (en)2007-10-302013-01-01Sharp Laboratories Of America, Inc.Methods and systems for backlight modulation and brightness preservation
US8378956B2 (en)2007-11-302013-02-19Sharp Laboratories Of America, Inc.Methods and systems for weighted-error-vector-based source light selection
US9177509B2 (en)2007-11-302015-11-03Sharp Laboratories Of America, Inc.Methods and systems for backlight modulation with scene-cut detection
US8207932B2 (en)2007-12-262012-06-26Sharp Laboratories Of America, Inc.Methods and systems for display source light illumination level selection
US8179363B2 (en)2007-12-262012-05-15Sharp Laboratories Of America, Inc.Methods and systems for display source light management with histogram manipulation
US8223113B2 (en)2007-12-262012-07-17Sharp Laboratories Of America, Inc.Methods and systems for display source light management with variable delay
US8203579B2 (en)2007-12-262012-06-19Sharp Laboratories Of America, Inc.Methods and systems for backlight modulation with image characteristic mapping
US8169431B2 (en)2007-12-262012-05-01Sharp Laboratories Of America, Inc.Methods and systems for image tonescale design
US8531379B2 (en)2008-04-282013-09-10Sharp Laboratories Of America, Inc.Methods and systems for image compensation for ambient conditions
US8416179B2 (en)2008-07-102013-04-09Sharp Laboratories Of America, Inc.Methods and systems for color preservation with a color-modulated backlight
US8208762B1 (en)*2008-08-122012-06-26Adobe Systems IncorporatedOptimizing the performance of an image editing system in a client-server environment
US9330630B2 (en)2008-08-302016-05-03Sharp Laboratories Of America, Inc.Methods and systems for display source light management with rate change control
TWI464706B (en)*2009-03-132014-12-11Micro Star Int Co Ltd Dark portion exposure compensation method for simulating high dynamic range with single image and image processing device using the same
JP5295854B2 (en)*2009-04-282013-09-18株式会社レグラス Image processing apparatus and image processing program
US8165724B2 (en)2009-06-172012-04-24Sharp Laboratories Of America, Inc.Methods and systems for power-controlling display devices
TWI407777B (en)*2009-07-202013-09-01Silicon Integrated Sys CorpApparatus and method for feature-based dynamic contrast enhancement
US20110205397A1 (en)*2010-02-242011-08-25John Christopher HahnPortable imaging device having display with improved visibility under adverse conditions
JP5744510B2 (en)*2010-12-282015-07-08キヤノン株式会社 Image processing method
CN102625030B (en)*2011-02-012014-10-01株式会社理光video enhancement method and system
US8644638B2 (en)*2011-02-252014-02-04Microsoft CorporationAutomatic localized adjustment of image shadows and highlights
US9111174B2 (en)*2012-02-242015-08-18Riverain Technologies, LLCMachine learnng techniques for pectoral muscle equalization and segmentation in digital mammograms
FR3002824A1 (en)*2013-03-042014-09-05St Microelectronics Grenoble 2 METHOD AND DEVICE FOR GENERATING IMAGES WITH A HIGH DYNAMIC RANGE
CN104298982B (en)*2013-07-162019-03-08深圳市腾讯计算机系统有限公司A kind of character recognition method and device
WO2015133712A1 (en)*2014-03-062015-09-11삼성전자 주식회사Image decoding method and device therefor, and image encoding method and device therefor
GB2520822B (en)*2014-10-102016-01-13Aveva Solutions LtdImage rendering of laser scan data
US9911061B2 (en)*2015-06-072018-03-06Apple Inc.Fast histogram-based object tracking
CN105006019B (en)*2015-07-132017-11-28山东易创电子有限公司A kind of sequence chart exposure method of adjustment and device
US10152935B2 (en)*2016-02-292018-12-11Mitsubishi Electric CorporationColor correction apparatus, display apparatus, and color correction method
CN107220953A (en)2017-06-162017-09-29广东欧珀移动通信有限公司image processing method, device and terminal
CN107291473B (en)*2017-06-222020-12-08深圳传音通讯有限公司Wallpaper setting method and device
US10628929B2 (en)*2018-05-282020-04-21Augentix Inc.Method and computer system of image enhancement
CN109447915B (en)*2018-10-292021-06-29北京康拓红外技术股份有限公司Line scanning image quality improving method based on characteristic model establishment and gamma gray correction
JP6757392B2 (en)2018-11-212020-09-16株式会社モルフォ Image generator, image generation method and image generation program

Family Cites Families (185)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US3820133A (en)1972-07-241974-06-25C AdorneyTeaching device
US3864708A (en)1973-12-041975-02-04Brian S AllenAutomatic photographic apparatus and postcard vending machine
US4070587A (en)1975-02-141978-01-24Canon Kabushiki KaishaEnergizing control system for an intermittently energized device
US4072973A (en)1976-01-261978-02-07Mayo William DCamera signal system for portrait taking
US4089017A (en)1976-09-071978-05-09Polaroid CorporationAutomatic photostudio
JPH10285390A (en)1997-04-031998-10-23Minolta Co LtdImage processor
JPS5590383A (en)1978-12-271980-07-08Canon IncThermal printer
US4284876A (en)1979-04-241981-08-18Oki Electric Industry Co., Ltd.Thermal printing system
US4347518A (en)1979-09-041982-08-31Gould Inc.Thermal array protection apparatus
JPS6036397B2 (en)1980-03-311985-08-20株式会社東芝 thermal recording device
JPS574784A (en)1980-06-131982-01-11Canon IncThermal printer
US4385302A (en)1980-10-161983-05-24Fuji Xerox Co., Ltd.Multicolor recording apparatus
JPS57138960A (en)1981-02-201982-08-27Fuji Xerox Co LtdMulticolor heat sensitive recorder
DE3273429D1 (en)1981-06-191986-10-30Toshiba KkThermal printer
US4391535A (en)1981-08-101983-07-05Intermec CorporationMethod and apparatus for controlling the area of a thermal print medium that is exposed by a thermal printer
JPS58150370A (en)1982-03-021983-09-07Sony CorpProducing system of gradation signal for printer
US4514738A (en)1982-11-221985-04-30Tokyo Shibaura Denki Kabushiki KaishaThermal recording system
JPS59127781A (en)1983-01-111984-07-23Fuji Xerox Co LtdDriving circuit for thermal head
JPS59182758A (en)1983-04-011984-10-17Fuji Xerox Co LtdDrive circuit for thermal head
US4540992A (en)1983-04-071985-09-10Kabushiki Kaisha Daini SeikoshaThermal color transfer system
US4688051A (en)1983-08-151987-08-18Ricoh Company, Ltd.Thermal print head driving system
JPS6085675A (en)1983-10-171985-05-15Fuji Xerox Co LtdColor copying machine
JPS60101051A (en)1983-11-091985-06-05Fuji Xerox Co LtdMulticolor recording system
US4563691A (en)1984-12-241986-01-07Fuji Xerox Co., Ltd.Thermo-sensitive recording apparatus
US4884080A (en)1985-01-311989-11-28Kabushiki Kaisha ToshibaColor image printing apparatus
US5208686A (en)1985-03-011993-05-04Manchester R&D PartnershipLiquid crystal color display and method
DE3688715D1 (en)1985-03-301993-08-26Hitachi Ltd PRINTING METHOD OF THE SCANING TYPE AND ITS REALIZATION DEVICE.
US4686549A (en)1985-12-161987-08-11Minnesota Mining And Manufacturing CompanyReceptor sheet for thermal mass transfer printing
JPS62275768A (en)1986-05-241987-11-30Sony CorpPrinter
US4738526A (en)1986-11-211988-04-19Autostudio CorporationAuto-portrait photo studio
JPS63202182A (en)1987-02-181988-08-22Olympus Optical Co LtdTilted dot pattern forming method
US4739344A (en)1987-02-271988-04-19Astro-Med, Inc.Chart recorded having multiple thermal print heads
JPH02139258A (en)1988-08-181990-05-29Ricoh Co Ltd Recording density correction device
US5086484A (en)1988-08-241992-02-04Canon Kabushiki KaishaImage processing apparatus with fixed or variable threshold
DE68927970T2 (en)1988-09-081997-10-09Canon Kk Point image data output device
JPH02121853A (en)1988-10-311990-05-09Toshiba Corp Thermal head control circuit
JP2984009B2 (en)1989-02-031999-11-29株式会社リコー Thermal head drive
JPH0813552B2 (en)1989-02-171996-02-14松下電器産業株式会社 Gradation printer
JPH02235655A (en)1989-03-091990-09-18Kyocera Corp Thermal head drive device
US4907014A (en)*1989-05-181990-03-06Calcomp Inc.Safely retracting paper-cutting apparatus for a roll paper printer
US5086306A (en)1989-07-191992-02-04Ricoh Company, Ltd.Line head driving apparatus
JP2523188B2 (en)1989-08-071996-08-07シャープ株式会社 Printing control method of thermal printer
US5045952A (en)1989-08-211991-09-03Xerox CorporationMethod for edge enhanced error diffusion
JP2612616B2 (en)1989-08-311997-05-21富士写真フイルム株式会社 Method and apparatus for driving thermal head in printer
US5285220A (en)1989-11-221994-02-08Canon Kabushiki KaishaImage recording apparatus with tone correction for individual recording heads
US5046118A (en)*1990-02-061991-09-03Eastman Kodak CompanyTone-scale generation method and apparatus for digital x-ray images
WO1991015831A1 (en)1990-04-051991-10-17Seiko Epson CorporationPage description language interpreter
US5130821A (en)1990-04-161992-07-14Eastman Kodak CompanyMethod and apparatus for digital halftoning employing density distribution for selection of a threshold template
US5208684A (en)1990-04-261993-05-04Fujitsu LimitedHalf-tone image processing system
US5323245A (en)1990-09-141994-06-21Minnesota Mining And Manufacturing CompanyPerpendicular, unequal frequency non-conventional screen patterns for electronic halftone generation
US5268706A (en)1991-02-141993-12-07Alps Electric Co., Ltd.Actuating control method of thermal head
JP2957721B2 (en)1991-02-251999-10-06アルプス電気株式会社 Thermal control method of thermal head
US5132703A (en)1991-03-081992-07-21Yokogawa Electric CorporationThermal history control in a recorder using a line thermal head
US5132709A (en)1991-08-261992-07-21Zebra Technologies CorporationApparatus and method for closed-loop, thermal control of printing head
US5307425A (en)1991-09-021994-04-26Rohm Co., Ltd.Bi-level halftone processing circuit and image processing apparatus using the same
US5455685A (en)*1991-09-041995-10-03Fuji Photo Film Co., Ltd.Video camera exposure control apparatus for controlling iris diaphragm and automatic gain control operating speed
US5244861A (en)1992-01-171993-09-14Eastman Kodak CompanyReceiving element for use in thermal dye transfer
US5625399A (en)1992-01-311997-04-29Intermec CorporationMethod and apparatus for controlling a thermal printhead
US5777599A (en)1992-02-141998-07-07Oki Electric Industry Co., Ltd.Image generation device and method using dithering
JPH0654195A (en)1992-02-281994-02-25Eastman Kodak CoSystem and method for image scanner for improvement of microfilm image quality
JPH07205469A (en)1992-03-271995-08-08Nec Data Terminal LtdThermal head
JP3412174B2 (en)*1992-05-212003-06-03松下電器産業株式会社 Automatic exposure control device
JP3209797B2 (en)1992-07-032001-09-17松下電器産業株式会社 Gradation printer
JP2850930B2 (en)1992-10-121999-01-27日本ビクター株式会社 Melt type thermal transfer printing system
US5729274A (en)1992-11-051998-03-17Fuji Photo Film Co., Ltd.Color direct thermal printing method and thermal head of thermal printer
US5469203A (en)1992-11-241995-11-21Eastman Kodak CompanyParasitic resistance compensation for a thermal print head
US5644351A (en)1992-12-041997-07-01Matsushita Electric Industrial Co., Ltd.Thermal gradation printing apparatus
US5450099A (en)1993-04-081995-09-12Eastman Kodak CompanyThermal line printer with staggered head segments and overlap compensation
KR0138362B1 (en)1993-05-171998-05-15김광호 Thermal transfer printer device and method
US5805780A (en)1993-05-251998-09-08Dai Nippon Printing Co., Ltd.Photographing box
JP3397371B2 (en)1993-05-272003-04-14キヤノン株式会社 Recording device and recording method
US5818474A (en)1993-06-301998-10-06Canon Kabushiki KaishaInk-jet recording apparatus and method using asynchronous masks
US5479263A (en)1993-07-011995-12-26Xerox CorporationGray pixel halftone encoder
US5623297A (en)1993-07-071997-04-22Intermec CorporationMethod and apparatus for controlling a thermal printhead
US5956067A (en)1993-10-281999-09-21Nisca CorporationThermal transfer printing device and method
JP2746088B2 (en)1993-11-301998-04-28進工業株式会社 Thermal head device
JP3066237B2 (en)1993-12-212000-07-17フジコピアン株式会社 Thermal transfer material and color image forming method
JPH07178948A (en)1993-12-241995-07-18Shinko Electric Co LtdThermal printer
DE69433608T2 (en)1993-12-272005-02-17Sharp K.K. Grading control method and image quality improvement for a thermal printer
BE1008076A3 (en)*1994-02-151996-01-09Agfa Gevaert NvCOLOR NEGATIVE SCANNING AND TRANSFORMATION IN COLORS OF ORIGINAL scene.
US5497174A (en)1994-03-111996-03-05Xerox CorporationVoltage drop correction for ink jet printer
US5786900A (en)1994-03-231998-07-28Fuji Photo Film Co., Ltd.Image recording device for recording multicolor images with dot pitch pattern randomly arranged only in the sub-scanning direction
JP3381755B2 (en)1994-10-112003-03-04セイコーエプソン株式会社 Method and apparatus for improved adaptive filtering and thresholding to reduce image graininess
US5602653A (en)1994-11-081997-02-11Xerox CorporationPixel pair grid halftoning for a hyperacuity printer
US5786837A (en)1994-11-291998-07-28Agfa-Gevaert N.V.Method and apparatus for thermal printing with voltage-drop compensation
JP2702426B2 (en)1994-12-161998-01-21日本電気データ機器株式会社 Thermal head device
JPH08169132A (en)1994-12-201996-07-02Nec Data Terminal LtdThermal head device
US5694484A (en)1995-05-151997-12-02Polaroid CorporationSystem and method for automatically processing image data to provide images of optimal perceptual quality
US5835627A (en)1995-05-151998-11-10Higgins; Eric W.System and method for automatically optimizing image quality and processing time
US6128099A (en)1995-06-082000-10-03Delabastita; Paul A.Halftone screen generator, halftone screen and method for generating same
US5707082A (en)1995-07-181998-01-13Moore Business Forms IncThermally imaged colored baggage tags
US6657741B1 (en)1995-08-072003-12-02Tr Systems, Inc.Multiple print engine system with selectively distributed ripped pages
JPH0952382A (en)1995-08-171997-02-25Fuji Photo Film Co LtdMethod and apparatus for correcting heat accumulation
US5664253A (en)1995-09-121997-09-02Eastman Kodak CompanyStand alone photofinishing apparatus
JP3501567B2 (en)1995-09-282004-03-02富士写真フイルム株式会社 Color thermal printer
JP3523724B2 (en)1995-09-292004-04-26東芝テック株式会社 Thermal transfer color printer
JP4036896B2 (en)*1995-12-012008-01-23キネティック リミテッド Imaging system
US5623581A (en)1996-01-221997-04-22Apbi Interactive Kiosk SystemsDirect view interactive photo kiosk and image forming process for same
US5913019A (en)1996-01-221999-06-15Foto Fantasy, Inc.Direct view interactive photo kiosk and composite image forming process for same
JP3625333B2 (en)1996-02-132005-03-02富士写真フイルム株式会社 Thermal image recording apparatus and recording method
US5777638A (en)1996-02-221998-07-07Hewlett-Packard CompanyPrint mode to compensate for microbanding
US5956421A (en)1996-02-281999-09-21Canon Kabushiki KaishaImage processing method and apparatus for determining a binarization threshold value used for binarizing a multi-valued image and performing binarization processing
US5870505A (en)*1996-03-141999-02-09Polaroid CorporationMethod and apparatus for pixel level luminance adjustment
JP3589783B2 (en)1996-04-112004-11-17富士写真フイルム株式会社 Thermal storage correction method and device
US5909244A (en)*1996-04-151999-06-01Massachusetts Institute Of TechnologyReal time adaptive digital image processing for dynamic range remapping of imagery including low-light-level visible imagery
US5880777A (en)*1996-04-151999-03-09Massachusetts Institute Of TechnologyLow-light-level imaging and image processing
JP3426851B2 (en)1996-04-302003-07-14大日本スクリーン製造株式会社 Dot forming method for multicolor printing
US5889546A (en)1996-06-041999-03-30Shinko Electric Co., Ltd.Heat accumulation control device for line-type thermoelectric printer
US5809177A (en)1996-06-061998-09-15Xerox CorporationHybrid error diffusion pattern shifting reduction using programmable threshold perturbation
US5668638A (en)1996-06-271997-09-16Xerox CorporationError diffusion method with symmetric enhancement
US6233360B1 (en)1996-09-242001-05-15Xerox CorporationMethod and system for hybrid error diffusion processing of image information using adaptive white and black reference values
JPH10109436A (en)1996-10-041998-04-28Seiko Denshi Kiki KkColor image recording method, color image recording device, and color image recording controlling method
JP3907783B2 (en)*1996-12-122007-04-18富士フイルム株式会社 Color conversion method
JPH10239780A (en)1996-12-241998-09-11Fuji Photo Film Co LtdMethod and device for outputting photographic image data
US5970224A (en)1997-04-141999-10-19Xerox CorporationMultifunctional printing system with queue management
JPH1110852A (en)1997-06-241999-01-19Fuji Photo Film Co LtdMultihead type printer
US6771832B1 (en)*1997-07-292004-08-03Panasonic Communications Co., Ltd.Image processor for processing an image with an error diffusion process and image processing method for processing an image with an error diffusion process
JP3683387B2 (en)1997-08-012005-08-17シャープ株式会社 Network computer built-in printer and computer network system provided with the same
JPH1158807A (en)1997-08-111999-03-02Minolta Co LtdRecorder
JP3690082B2 (en)1997-09-112005-08-31コニカミノルタビジネステクノロジーズ株式会社 Selection method of image forming apparatus connected to network
US6334660B1 (en)1998-10-312002-01-01Hewlett-Packard CompanyVarying the operating energy applied to an inkjet print cartridge based upon the operating conditions
US6069982A (en)1997-12-232000-05-30Polaroid CorporationEstimation of frequency dependence and grey-level dependence of noise in an image
US6101000A (en)1998-01-302000-08-08Eastman Kodak CompanyPhotographic processing apparatus and method
US6172768B1 (en)*1998-02-052001-01-09Canon Kabushiki KaishaHalftoning with changeable error diffusion weights
US6223267B1 (en)1998-02-262001-04-24Hewlett-Packard CompanyDynamically allocable RAM disk
US6106173A (en)1998-03-062000-08-22Asahi Kogaku Kogyo Kabushiki KaishaImage-forming system including a plurality of thermal heads and an image-forming sheet with a plurality of types of micro-capsules
US6226021B1 (en)1998-04-032001-05-01Alps Electric Co., Ltd.Image forming method of thermal transfer printer
US6760489B1 (en)*1998-04-062004-07-06Seiko Epson CorporationApparatus and method for image data interpolation and medium on which image data interpolation program is recorded
US5995654A (en)1998-05-281999-11-30Eastman Kodak CompanyDigital photofinishing system including scene balance and image sharpening digital image processing
US6631208B1 (en)1998-05-292003-10-07Fuji Photo Film Co., Ltd.Image processing method
US6208429B1 (en)1998-05-292001-03-27Flashpoint Technology, Inc.Method and system for band printing of rotated digital image data
JP3590265B2 (en)1998-06-112004-11-17富士写真フイルム株式会社 Image processing method
US6694051B1 (en)*1998-06-242004-02-17Canon Kabushiki KaishaImage processing method, image processing apparatus and recording medium
US6104468A (en)1998-06-292000-08-15Eastman Kodak CompanyImage movement in a photographic laboratory
JP3556859B2 (en)*1998-09-082004-08-25富士写真フイルム株式会社 Image correction method, image correction device, and recording medium
JP3754849B2 (en)1998-10-302006-03-15キヤノン株式会社 Data communication apparatus, control method, storage medium, and image printing system
US6847376B2 (en)*1998-11-132005-01-25Lightsurf Technologies, Inc.Method and system for characterizing color display monitor output
JP3829508B2 (en)1998-11-272006-10-04セイコーエプソン株式会社 Image processing apparatus, image processing method, and printing apparatus
JP3820497B2 (en)*1999-01-252006-09-13富士写真フイルム株式会社 Imaging apparatus and correction processing method for automatic exposure control
JP3369497B2 (en)1999-01-272003-01-20松下電送システム株式会社 Terminal device and MFP
US6332137B1 (en)*1999-02-112001-12-18Toshikazu HoriParallel associative learning memory for a standalone hardwired recognition system
US6276775B1 (en)1999-04-292001-08-21Hewlett-Packard CompanyVariable drop mass inkjet drop generator
TW473696B (en)1999-06-292002-01-21Casio Computer Co LtdPrinting apparatus and printing method
JP2002096470A (en)1999-08-242002-04-02Canon Inc Recording device and control method thereof, computer readable memory
US6425699B1 (en)1999-09-292002-07-30Hewlett-Packard CompanyUse of very small advances of printing medium for improved image quality in incremental printing
CN1158184C (en)1999-09-292004-07-21精工爱普生株式会社 Printer and its control method
US6690488B1 (en)1999-09-302004-02-10Polaroid CorporationMethod and apparatus for estimating the spatial frequency response of a digital image acquisition system from the images it produces
US6628899B1 (en)1999-10-082003-09-30Fuji Photo Film Co., Ltd.Image photographing system, image processing system, and image providing system connecting them, as well as photographing camera, image editing apparatus, image order sheet for each object and method of ordering images for each object
US6856416B1 (en)1999-11-032005-02-15Toshiba Tech CorporationDynamic load balancing for a tandem printing system
JP3684951B2 (en)*1999-11-112005-08-17松下電器産業株式会社 Image search method and apparatus
GB2356375B (en)1999-11-222003-04-09Esselte NvA method of controlling a print head
JP2001186323A (en)*1999-12-242001-07-06Fuji Photo Film Co LtdIdentification photograph system and picture on processing method
EP1137247A3 (en)2000-01-282002-10-09Eastman Kodak CompanyPhotofinishing system and method
US6537410B2 (en)2000-02-012003-03-25Polaroid CorporationThermal transfer recording system
US7092116B2 (en)2000-06-292006-08-15Douglas CalawayMethod and system for processing an annotated digital photograph using a composite image
US6762855B1 (en)2000-07-072004-07-13Eastman Kodak CompanyVariable speed printing system
US6583852B2 (en)2000-09-212003-06-24Shutterfly, Inc.Apparatus, architecture and method for high-speed printing
JP3740403B2 (en)2000-10-232006-02-01キヤノン株式会社 Printing system, printing control apparatus, information processing method, control program
EP1201449A3 (en)2000-10-312003-05-14Hewlett-Packard CompanyA system and method for improving the edge quality of inkjet printouts
JP2002160395A (en)2000-11-222002-06-04Fuji Photo Film Co LtdMethod and device for recording image
US7272390B1 (en)2000-12-192007-09-18Cisco Technology, Inc.Method and system for sending facsimile transmissions from mobile devices
US7355732B2 (en)2000-12-222008-04-08Ricoh Company, Ltd.Printing mechanism for wireless devices
JP2002199221A (en)2000-12-272002-07-12Fuji Photo Film Co LtdDensity correction curve generating device and method
GB2386456B (en)2001-01-082005-03-09Hyperdrive Computers LtdComputer system
JP4662401B2 (en)2001-02-052011-03-30ローム株式会社 Printing method and thermal printer
JP4154128B2 (en)2001-02-142008-09-24株式会社リコー Image processing apparatus, image processing method, and recording medium on which program for executing the method is recorded
US7154621B2 (en)2001-03-202006-12-26Lightsurf Technologies, Inc.Internet delivery of digitized photographs
US6999202B2 (en)2001-03-272006-02-14Polaroid CorporationMethod for generating a halftone of a source image
US6842186B2 (en)2001-05-302005-01-11Polaroid CorporationHigh speed photo-printing apparatus
US6937365B2 (en)2001-05-302005-08-30Polaroid CorporationRendering images utilizing adaptive error diffusion
CA2448327A1 (en)2001-05-302002-12-05Polaroid CorporationA high speed photo-printing apparatus
US6826310B2 (en)*2001-07-062004-11-30Jasc Software, Inc.Automatic contrast enhancement
JP3634342B2 (en)2001-07-232005-03-30セイコーエプソン株式会社 Printing system and printing method
JP2003036438A (en)2001-07-252003-02-07Minolta Co LtdProgram for specifying red-eye in image, recording medium, image processor and method for specifying red- eye
US6819347B2 (en)2001-08-222004-11-16Polaroid CorporationThermal response correction system
US7133070B2 (en)*2001-09-202006-11-07Eastman Kodak CompanySystem and method for deciding when to correct image-specific defects based on camera, scene, display and demographic data
WO2003039143A1 (en)2001-10-302003-05-08Nikon CorporationImage accumulation apparatus, image accumulation support apparatus, image accumulation system, image control apparatus, image storage apparatus
JP3992177B2 (en)*2001-11-292007-10-17株式会社リコー Image processing apparatus, image processing method, and computer program
US6906736B2 (en)2002-02-192005-06-14Polaroid CorporationTechnique for printing a color image
WO2003072362A1 (en)2002-02-222003-09-04Polaroid CorporationCommon mode voltage correction
JP3928704B2 (en)*2002-02-262007-06-13セイコーエプソン株式会社 Image processing apparatus, image processing method, medium storing image processing program, and image processing program
US7283666B2 (en)2003-02-272007-10-16Saquib Suhail SDigital image exposure correction
US20040179226A1 (en)2003-03-102004-09-16Burkes Theresa A.Accelerating printing
US8773685B2 (en)2003-07-012014-07-08Intellectual Ventures I LlcHigh-speed digital image printing system
JP4119338B2 (en)2003-09-302008-07-16三栄源エフ・エフ・アイ株式会社 Heat resistant filling material

Patent Citations (38)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US4154523A (en)1977-05-311979-05-15Eastman Kodak CompanyExposure determination apparatus for a photographic printer
US4168120A (en)1978-04-171979-09-18Pako CorporationAutomatic exposure corrections for photographic printer
US4933709A (en)1989-09-251990-06-12Eastman Kodak CompanyAdjusting photographic printer color exposure determination algorithms
US4962403A (en)1989-12-111990-10-09Eastman Kodak CompanyAdjusting photographic printer color exposure determination algorithms
JPH04119338A (en)1990-09-101992-04-20Nikon Corp Camera photometry calculation device
JPH06308632A (en)1993-04-261994-11-04Fuji Photo Film Co LtdExposure control method
US5835244A (en)*1993-10-151998-11-10Linotype-Hell AgMethod and apparatus for the conversion of color values
US6133983A (en)1993-11-122000-10-17Eastman Kodak CompanyPhotographic printing method and apparatus for setting a degree of illuminant chromatic correction using inferential illuminant detection
US5724456A (en)1995-03-311998-03-03Polaroid CorporationBrightness adjustment of images using digital scene analysis
EP0762736A2 (en)1995-08-301997-03-12Hewlett-Packard CompanyAutomatic color processing to correct hue shift and incorrect exposure
EP0762736A3 (en)1995-08-301998-02-18Hewlett-Packard CompanyAutomatic color processing to correct hue shift and incorrect exposure
JPH09138465A (en)1995-10-271997-05-27Samsung Aerospace Ind Ltd Photo printing equipment
EP0773470A1 (en)1995-11-091997-05-14Fuji Photo Film Co., Ltd.Image processing method for photographic printer
US5781315A (en)1995-11-091998-07-14Fuji Photo Film Co., Ltd.Image processing method for photographic printer
US5809164A (en)1996-03-071998-09-15Polaroid CorporationSystem and method for color gamut and tone compression using an ideal mapping function
US6028957A (en)*1996-03-072000-02-22Minolta Co., Ltd.Image forming apparatus having a noise removing unit
US5978106A (en)1996-06-211999-11-02Nikon CorporationPicture image processing method
US6128415A (en)1996-09-062000-10-03Polaroid CorporationDevice profiles for use in a digital image processing system
US5818975A (en)1996-10-281998-10-06Eastman Kodak CompanyMethod and apparatus for area selective exposure adjustment
US6243133B1 (en)*1997-03-072001-06-05Eastman Kodak CompanyMethod for automatic scene balance of digital images
US6563945B2 (en)*1997-03-242003-05-13Jack M. HolmPictorial digital image processing incorporating image and output device modifications
US6628823B1 (en)*1997-03-242003-09-30Jack M. HolmPictorial digital image processing incorporating adjustments to compensate for dynamic range differences
US6263091B1 (en)1997-08-222001-07-17International Business Machines CorporationSystem and method for identifying foreground and background portions of digitized images
US6204940B1 (en)1998-05-152001-03-20Hewlett-Packard CompanyDigital processing of scanned negative films
JP2000050080A (en)1998-05-282000-02-18Eastman Kodak CoDigital photograph finishing system containing digital picture processing of film exposure lacking gamma, scene balance, contrast normalization and picture visualization
JP2000050077A (en)1998-05-282000-02-18Eastman Kodak CoDigital photograph finishing system containing digital picture processing of selective acquisition color photograph medium
US6608926B1 (en)*1998-06-242003-08-19Canon Kabushiki KaishaImage processing method, image processing apparatus and recording medium
WO2000004492A3 (en)1998-07-152001-10-25Imation CorpImaging system and method
WO2000004492A2 (en)1998-07-152000-01-27Imation Corp.Imaging system and method
JP2000184270A (en)1998-12-142000-06-30Ricoh Co Ltd Digital still video camera
US6282317B1 (en)1998-12-312001-08-28Eastman Kodak CompanyMethod for automatic determination of main subjects in photographic images
EP1056272A1 (en)1999-05-202000-11-29Eastman Kodak CompanyCorrecting exposure in a rendered digital image
KR20010037684A (en)1999-10-192001-05-15이중구Apparatus for correlating of exposure automatically of a digital still camera and method for performing the same
US6650771B1 (en)*1999-11-222003-11-18Eastman Kodak CompanyColor management system incorporating parameter control channels
US6628826B1 (en)*1999-11-292003-09-30Eastman Kodak CompanyColor reproduction of images from color films
JP2001160908A (en)1999-12-022001-06-12Noritsu Koki Co Ltd Color density correction method, recording medium storing color density correction program, image processing device, and photographic printing device
JP2003008986A (en)2001-06-272003-01-10Casio Comput Co Ltd Imaging device and exposure control method
US6956967B2 (en)*2002-05-202005-10-18Eastman Kodak CompanyColor transformation for processing digital images

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"Adaptive Margin Support Vector Machines" (XP-002299707), pp. 1-16, by Jason Weston and Ralf Herbrich.
"Algorithmic Stability and Sanity-Check Bounds for Leave-One-Out Cross-Validation" (XP-002299710), pp. 1-20, by Michael Kearns and Dana Ron, Jan. 1997.
"Automated Global Enhancement of Digitized Photographs", by Bhukhanwata et al., 8087 IEEE Transactions on Consumer Electronics, Feb. 1994.

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
USRE43149E1 (en)2001-03-272012-01-31Senshin Capital, LlcMethod for generating a halftone of a source image
USRE42473E1 (en)2001-05-302011-06-21Senshin Capital, LlcRendering images utilizing adaptive error diffusion
US7907157B2 (en)2002-02-192011-03-15Senshin Capital, LlcTechnique for printing a color image
US7826660B2 (en)2003-02-272010-11-02Saquib Suhail SDigital image exposure correction
US8265420B2 (en)2003-02-272012-09-11Senshin Capital, LlcDigital image exposure correction
US8773685B2 (en)2003-07-012014-07-08Intellectual Ventures I LlcHigh-speed digital image printing system
US20080075383A1 (en)*2006-09-222008-03-27Peng WuMethods And Systems For Identifying An Ill-Exposed Image
US7865032B2 (en)*2006-09-222011-01-04Hewlett-Packard Development Company, L.P.Methods and systems for identifying an ill-exposed image

Also Published As

Publication number | Publication date
US20040170316A1 (en)2004-09-02
JP2006515136A (en)2006-05-18
JP2009005395A (en)2009-01-08
WO2004077816A2 (en)2004-09-10
WO2004077816A3 (en)2005-03-24
EP1597911A2 (en)2005-11-23
US20070036457A1 (en)2007-02-15
US7826660B2 (en)2010-11-02
US20100329558A1 (en)2010-12-30
US8265420B2 (en)2012-09-11

Similar Documents

Publication | Publication Date | Title
US7283666B2 (en)Digital image exposure correction
US9020257B2 (en)Transforming a digital image from a low dynamic range (LDR) image to a high dynamic range (HDR) image
US7023580B2 (en)System and method for digital image tone mapping using an adaptive sigmoidal function based on perceptual preference guidelines
JP3880553B2 (en) Image processing method and apparatus
US7778483B2 (en)Digital image processing method having an exposure correction based on recognition of areas corresponding to the skin of the photographed subject
US7436997B2 (en)Light source estimating device, light source estimating method, and imaging device and image processing method
US7065255B2 (en)Method and apparatus for enhancing digital images utilizing non-image data
US7840084B2 (en)Digital camera incorporating a sharpness predictor
US7436995B2 (en)Image-processing apparatus, image-capturing apparatus, image-processing method and image-processing program
US9432589B2 (en)Systems and methods for generating high dynamic range images
US6898312B2 (en)Method and device for the correction of colors of photographic images
JPH10187994A (en)Method for evaluating and controlling digital image contrast
US8345975B2 (en)Automatic exposure estimation for HDR images based on image statistics
CN100411445C (en) Image processing method and device for correcting image brightness distribution
KR102243292B1 (en)Method and Device for making HDR image by using color response curve, camera, and recording medium
US7570809B1 (en)Method for automatic color balancing in digital images
US20030214663A1 (en)Processing of digital images
JP2008533773A (en) Profiling digital image input devices
JPWO2003085989A1 (en) Image processing apparatus, image processing program, and image processing method

Legal Events

Date | Code | Title | Description
AS | Assignment

Owner name:POLAROID CORPORATION, WASHINGTON

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SAQUIB, SUHAIL S.;THORNTON, JAY E.;REEL/FRAME:013836/0682;SIGNING DATES FROM 20030226 TO 20030227

AS | Assignment

Owner name:WILMINGTON TRUST COMPANY, AS COLLATERAL AGENT, DEL

Free format text:SECURITY AGREEMENT;ASSIGNORS:POLAROLD HOLDING COMPANY;POLAROID CORPORATION;POLAROID ASIA PACIFIC LLC;AND OTHERS;REEL/FRAME:016602/0332

Effective date:20050428

Owner name:JPMORGAN CHASE BANK,N.A,AS ADMINISTRATIVE AGENT, WI

Free format text:SECURITY INTEREST;ASSIGNORS:POLAROID HOLDING COMPANY;POLAROID CORPORATION;POLAROID ASIA PACIFIC LLC;AND OTHERS;REEL/FRAME:016602/0603

Effective date:20050428

AS | Assignment

Owner name:POLAROID HOLDING COMPANY, MASSACHUSETTS

Free format text:RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:WILMINGTON TRUST COMPANY;REEL/FRAME:019699/0512

Effective date:20070425

Owner name:POLAROID CORPORATION, MASSACHUSETTS

Free format text:RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:WILMINGTON TRUST COMPANY;REEL/FRAME:019699/0512

Effective date:20070425

Owner name:POLAROID CAPITAL LLC, MASSACHUSETTS

Free format text:RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:WILMINGTON TRUST COMPANY;REEL/FRAME:019699/0512

Effective date:20070425

Owner name:POLAROID ASIA PACIFIC LLC, MASSACHUSETTS

Free format text:RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:WILMINGTON TRUST COMPANY;REEL/FRAME:019699/0512

Effective date:20070425

Owner name:POLAROID EYEWEAR LLC, MASSACHUSETTS

Free format text:RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:WILMINGTON TRUST COMPANY;REEL/FRAME:019699/0512

Effective date:20070425

Owner name:POLOROID INTERNATIONAL HOLDING LLC, MASSACHUSETTS

Free format text:RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:WILMINGTON TRUST COMPANY;REEL/FRAME:019699/0512

Effective date:20070425

Owner name:POLAROID INVESTMENT LLC, MASSACHUSETTS

Free format text:RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:WILMINGTON TRUST COMPANY;REEL/FRAME:019699/0512

Effective date:20070425

Owner name:POLAROID LATIN AMERICA I CORPORATION, MASSACHUSETT

Free format text:RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:WILMINGTON TRUST COMPANY;REEL/FRAME:019699/0512

Effective date:20070425

Owner name:POLAROID NEW BEDFORD REAL ESTATE LLC, MASSACHUSETT

Free format text:RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:WILMINGTON TRUST COMPANY;REEL/FRAME:019699/0512

Effective date:20070425

Owner name:POLAROID NORWOOD REAL ESTATE LLC, MASSACHUSETTS

Free format text:RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:WILMINGTON TRUST COMPANY;REEL/FRAME:019699/0512

Effective date:20070425

Owner name:POLAROID WALTHAM REAL ESTATE LLC, MASSACHUSETTS

Free format text:RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:WILMINGTON TRUST COMPANY;REEL/FRAME:019699/0512

Effective date:20070425

Owner name:PETTERS CONSUMER BRANDS, LLC, MASSACHUSETTS

Free format text:RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:WILMINGTON TRUST COMPANY;REEL/FRAME:019699/0512

Effective date:20070425

Owner name:PETTERS CONSUMER BRANDS INTERNATIONAL, LLC, MASSAC

Free format text:RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:WILMINGTON TRUST COMPANY;REEL/FRAME:019699/0512

Effective date:20070425

Owner name:ZINK INCORPORATED, MASSACHUSETTS

Free format text:RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:WILMINGTON TRUST COMPANY;REEL/FRAME:019699/0512

Effective date:20070425

STCF  Information on status: patent grant

Free format text: PATENTED CASE

AS  Assignment

Owner names: POLAROID HOLDING COMPANY; POLAROID INTERNATIONAL HOLDING LLC; POLAROID INVESTMENT LLC; POLAROID LATIN AMERICA I CORPORATION; POLAROID NEW BEDFORD REAL ESTATE LLC; POLAROID NORWOOD REAL ESTATE LLC; POLAROID WALTHAM REAL ESTATE LLC; POLAROID CONSUMER ELECTRONICS, LLC (FORMERLY KNOWN AS PETTERS CONSUMER BRANDS, LLC); POLAROID CONSUMER ELECTRONICS INTERNATIONAL, LLC (FORMERLY KNOWN AS PETTERS CONSUMER BRANDS INTERNATIONAL, LLC); ZINK INCORPORATED; POLAROID CORPORATION; POLAROID ASIA PACIFIC LLC; POLAROID CAPITAL LLC; POLAROID EYEWEAR I LLC (all of MASSACHUSETTS)

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:020733/0001

Effective date: 20080225

AS  Assignment

Owner name: SENSHIN CAPITAL, LLC, DELAWARE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:POLAROID CORPORATION;REEL/FRAME:021040/0001

Effective date: 20080415

FEPP  Fee payment procedure

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY  Fee payment

Year of fee payment: 4

AS  Assignment

Owner name: INTELLECTUAL VENTURES I LLC, DELAWARE

Free format text: MERGER;ASSIGNOR:SENSHIN CAPITAL, LLC;REEL/FRAME:030639/0279

Effective date: 20130212

AS  Assignment

Owner name: MOROOD INTERNATIONAL, SPC, SAUDI ARABIA

Free format text: SECURITY AGREEMENT;ASSIGNOR:ZINK IMAGING, INC.;REEL/FRAME:030820/0436

Effective date: 20130508

AS  Assignment

Owner name: IKOFIN LTD., HONG KONG

Free format text: SECURITY AGREEMENT;ASSIGNOR:ZINK IMAGING, INC.;REEL/FRAME:031746/0194

Effective date: 20131118

AS  Assignment

Owner name: LOPEZ, GERARD, MASSACHUSETTS

Free format text: SECURITY INTEREST;ASSIGNOR:ZINK IMAGING, INC.;REEL/FRAME:032467/0121

Effective date: 20140317

Owner name: MANGROVE III INVESTMENTS SARL, LUXEMBOURG

Free format text: SECURITY INTEREST;ASSIGNOR:ZINK IMAGING, INC.;REEL/FRAME:032467/0141

Effective date: 20140317

FPAY  Fee payment

Year of fee payment: 8

MAFP  Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 12

