CROSS-REFERENCE TO RELATED APPLICATIONS

Reference is made to commonly assigned, co-pending U.S. patent application Ser. No. ______ (Docket K000702), entitled: “Digital camera system having remote control”, by Karn et al.; and to commonly assigned, co-pending U.S. patent application Ser. No. ______ (Docket K000759), entitled: “Digital camera system having multiple capture settings”, by Cucci et al., each of which is incorporated herein by reference.
FIELD OF THE INVENTION

This invention pertains to the field of digital video cameras, and more particularly to a digital camera having a low-power capture mode.
BACKGROUND OF THE INVENTION

Digital capture devices, such as digital cameras and camera phones, typically capture and store both still digital images and video clips. These digital capture devices typically include a color display which is used to display captured still digital images and video clips. In many situations, these digital capture devices are held by the user, who uses the color display to compose the images as they are captured. In some situations, the digital capture device is mounted on a tripod or another type of camera mounting device, so that it does not need to be held by the user. In some situations, the digital capture device is controlled using a remote control, in order to initiate and terminate the capture of images.
It is known to provide rugged digital capture devices that can be secured to various objects, such as a bike helmet or scuba mask, or mounted to the handlebars of a motorcycle or the front of a surfboard. For example, the GoPro HD Hero2 digital camera, sold by GoPro Inc. of Half Moon Bay, California, is sold as part of an “Outdoor edition” package which includes various straps, pivot arms, and adhesive mounts to enable the digital camera to capture images while performing activities such as biking, skiing, skating and kayaking. However, the HD Hero2 camera includes only a single image capture system, which captures images using an optical axis directed outward from the “front” of the camera. This can cause excessive wind resistance, and presents a high profile that is more susceptible to damage and to image artifacts from vibrations in some situations.
It is also known to provide remote controls as accessories for digital cameras. For example, U.S. Patent Application Publication No. 2011/0058052 to Bolton et al., entitled “Systems and methods for remote camera control,” describes a portable media device (PMD) which includes a digital camera capable of capturing still images and video that can be controlled remotely using an accessory. The accessory can register with the PMD to automatically receive notifications whenever there is a change in the camera state. The camera states can include mode, operation status, and configuration settings. The accessory can send instructions to a camera application that interfaces with the camera to control the camera. The accessory can remotely activate the digital camera, change the digital camera's mode, and send instructions to operate the digital camera. The accessory and the PMD can concurrently control the camera. The PMD can send the captured still images and recorded video to the accessory for preview and can receive instructions from the accessory. Unfortunately, because the accessory receives notifications whenever there is a change in the camera state, power must be continuously supplied to the accessory to ensure that each notification can be received. This can rapidly deplete the batteries which power the accessory.
It is also known to provide a video camera having two lenses pointing in perpendicular directions, as described in U.S. Pat. No. 6,288,742 to Ansari et al., entitled “Video Camera Including Multiple Image Sensors.” This patent describes a digital motion camera useful in teleconferencing which includes two lenses and two image sensors. The first lens is used to provide a relatively wide angle view of a room and the second lens is used to provide high resolution document transmission capability. During a video telephone conference, the camera permits fast switching between an image of the room as seen through the first lens or an image of a document as seen through the second lens, without the need for pan and tilt stages or a plurality of complete camera units. However, this camera is always mounted in the same orientation, regardless of which lens is used to capture images. The camera does not include multiple camera mounts to enable the camera to be mounted in different orientations when the second lens is used to capture images.
It is also known to provide a camera carrying case that includes more than one tripod screw socket on different sides of the case, as described in U.S. Pat. No. 1,258,437 to Nord, entitled “Camera carrying case.” However, the case is designed for a camera having a single lens with a single optical axis. The two tripod screw sockets are used to capture landscape and portrait orientation images in the direction of this single optical axis.
Thus, there remains a need to provide a digital camera that can be used in a “conventional” capture mode, where the digital camera is held by the user while capturing digital images, and which can also be used in a “streamlined” mounted mode, which provides a lower profile and reduced wind resistance when the digital camera captures images while mounted to a moving object such as a bicycle.
SUMMARY OF THE INVENTION

A digital camera system providing a low-power image capture mode, comprising:
a first image capture system including:
- an image sensor for capturing a digital image; and
- an optical system for forming an image of a scene onto the image sensor;
an image display;
a power management system providing a normal image capture mode wherein captured digital images are displayed on the image display as they are captured and a low-power image capture mode wherein captured digital images are not displayed on the image display as they are captured;
a user interface including a plurality of user controls, including a first user control for selecting between the normal image capture mode and the low-power image capture mode, and a second user control for initiating a video capture operation;
a data processing system;
a storage memory for storing captured images; and
a program memory communicatively connected to the data processing system and storing instructions configured to cause the data processing system to implement a method for capturing digital images, wherein the method includes:
- setting the digital camera system to operate in either the normal image capture mode or the low-power image capture mode in response to user activation of the first user control;
- initiating a video capture operation in response to user activation of the second user control; and
- capturing a sequence of digital images and recording the sequence of digital images in the storage memory;
- wherein if the digital camera system is set to operate in the normal image capture mode the sequence of captured digital images is displayed on the image display as it is captured, and if the digital camera system is set to operate in the low-power image capture mode the sequence of captured digital images is not displayed on the image display as it is captured, and wherein if the first user control is activated while the sequence of digital images is being captured, the power management system switches between the normal image capture mode and the low-power image capture mode without interrupting the video capture operation.
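The claimed interaction between the two user controls and the power management system can be sketched in a few lines of Python. This is an illustrative model only, not the patent's implementation; all class and method names are hypothetical.

```python
from dataclasses import dataclass, field

NORMAL, LOW_POWER = "normal", "low_power"

@dataclass
class CameraSystem:
    # Hypothetical model of the claimed digital camera system.
    mode: str = NORMAL
    capturing: bool = False
    frames: list = field(default_factory=list)      # stands in for the storage memory
    displayed: list = field(default_factory=list)   # stands in for the image display

    def press_mode_control(self):
        # First user control: toggles the power mode without
        # interrupting any video capture operation in progress.
        self.mode = LOW_POWER if self.mode == NORMAL else NORMAL

    def press_capture_control(self):
        # Second user control: starts or stops a video capture operation.
        self.capturing = not self.capturing

    def capture_frame(self, frame):
        if not self.capturing:
            return
        self.frames.append(frame)            # always recorded to storage
        if self.mode == NORMAL:
            self.displayed.append(frame)     # displayed only in normal mode
```

Activating the first user control while a capture is in progress changes only whether frames are routed to the display; recording to the storage memory continues uninterrupted.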
The present invention has the advantage that a reduced power mode is provided for use when the digital camera system is mounted in a configuration where the image display cannot be viewed by the user.
It has the additional advantage that the user can enter the reduced power mode without interrupting a video capture operation.
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a high-level diagram showing the components of a digital camera including two image capture systems;
FIG. 2 is a flow diagram depicting typical image processing operations used to process digital images in the digital camera of FIG. 1;
FIGS. 3A-3C are drawings depicting different views of a digital camera in accordance with an embodiment of the present invention;
FIG. 4A is a drawing depicting the digital camera of FIGS. 3A-3C mounted using a helmet mount;
FIG. 4B is a drawing depicting the helmet mount clip from FIG. 4A;
FIG. 4C is a drawing depicting the helmet mount stud from FIG. 4A;
FIG. 5A is a drawing depicting a bar mount for a digital camera;
FIG. 5B is an exploded view depicting the components of the bar mount of FIG. 5A;
FIG. 6 is a flowchart showing steps for controlling a digital camera having a low-power image capture mode;
FIG. 7A is a high-level diagram showing the components of a remote control module in accordance with the present invention;
FIG. 7B is a drawing depicting a front view of the remote control module of FIG. 7A; and
FIG. 8 is a flowchart showing steps for managing the power in a digital camera system including a remote control module.
It is to be understood that the attached drawings are for purposes of illustrating the concepts of the invention and may not be to scale.
DETAILED DESCRIPTION OF THE INVENTION

In the following description, a preferred embodiment of the present invention will be described in terms that would ordinarily be implemented as a software program. Those skilled in the art will readily recognize that the equivalent of such software can also be constructed in hardware. Because image manipulation algorithms and systems are well known, the present description will be directed in particular to algorithms and systems forming part of, or cooperating more directly with, the system and method in accordance with the present invention. Other aspects of such algorithms and systems, and hardware or software for producing and otherwise processing the image signals involved therewith, not specifically shown or described herein, can be selected from such systems, algorithms, components and elements known in the art. Given the system as described according to the invention in the following materials, software not specifically shown, suggested or described herein that is useful for implementation of the invention is conventional and within the ordinary skill in such arts.
Still further, as used herein, a computer program for performing the method of the present invention can be stored in a non-transitory, tangible computer readable storage medium, which can include, for example: magnetic storage media such as a magnetic disk (such as a hard drive or a floppy disk) or magnetic tape; optical storage media such as an optical disc, optical tape, or machine readable bar code; solid state electronic storage devices such as random access memory (RAM), or read only memory (ROM); or any other physical device or medium employed to store a computer program having instructions for controlling one or more computers to practice the method according to the present invention.
Because digital cameras employing imaging devices and related circuitry for signal capture, processing, and display are well known, the present description will be directed in particular to elements forming part of, or cooperating more directly with, the method and apparatus in accordance with the present invention. Elements not specifically shown or described herein are selected from those known in the art. Certain aspects of the embodiments to be described are provided in software. Given the system as shown and described according to the invention in the following materials, software not specifically shown, described or suggested herein that is useful for implementation of the invention is conventional and within the ordinary skill in such arts.
The invention is inclusive of combinations of the embodiments described herein. References to “a particular embodiment” and the like refer to features that are present in at least one embodiment of the invention. Separate references to “an embodiment” or “particular embodiments” or the like do not necessarily refer to the same embodiment or embodiments; however, such embodiments are not mutually exclusive, unless so indicated or as are readily apparent to one of skill in the art. The use of singular or plural in referring to the “method” or “methods” and the like is not limiting. It should be noted that, unless otherwise explicitly noted or required by context, the word “or” is used in this disclosure in a non-exclusive sense.
The following description of a digital camera will be familiar to one skilled in the art. It will be obvious that many variations of this embodiment are possible, and that particular variations can be selected to reduce cost, add features, or improve the performance of the camera.
FIG. 1 depicts a block diagram of a digital photography system, including a digital camera 10. Preferably, the digital camera 10 is a portable battery-operated device, small enough to be easily handheld by a user when capturing and reviewing images, as will be described later in reference to FIGS. 3A-3C. The digital camera 10 produces digital images that are stored as digital image files using image memory 30. The phrase “digital image” or “digital image file,” as used herein, refers to any digital image file, such as a digital still image or a digital video file.
In some embodiments, the digital camera 10 captures both motion video images and still images. In some embodiments, the digital camera 10 can also be used to capture burst image sequences or time-lapse image sequences, where a plurality of digital images are captured at predefined or selectable time intervals. The digital camera 10 can also include other functions, including, but not limited to, the functions of a digital music player (e.g., an MP3 player), a mobile telephone, a GPS receiver, or a programmable digital assistant (PDA).
In some embodiments, the digital camera 10 includes a first image capture system 1A and a second image capture system 1B. The first image capture system 1A includes a first image sensor 14A (for example, a single-chip color CCD or CMOS image sensor) and a first optical system comprising a first lens 4A for forming an image of a scene (not shown) onto the first image sensor 14A. The first image capture system 1A has an optical axis A directed outward from the front of the first lens 4A. In some embodiments, the first lens 4A is a fixed focal length, fixed focus lens. In other embodiments, the first lens 4A is a zoom lens having a focus control and is controlled by zoom and focus motors or actuators (not shown). In some embodiments, the first lens 4A has a fixed lens aperture, and in other embodiments the lens aperture is controlled by a motor or actuator (not shown). The output of the first image sensor 14A is converted to digital form by an Analog Signal Processor (ASP) and Analog-to-Digital (A/D) converter 16A, and the digital data is provided to a multiplexer (MUX) 17.
In a preferred embodiment, the second image capture system 1B includes a second image sensor 14B (for example, a single-chip color CCD or CMOS image sensor) and a second optical system comprising a second lens 4B for forming an image of a scene (not shown) onto the second image sensor 14B. The second image capture system 1B has an optical axis B directed outward from the front of the second lens 4B. In some embodiments, the second lens 4B has the same focal length as the first lens 4A. In other embodiments, the second lens 4B has a different focal length (or a different focal length range if the first lens 4A and the second lens 4B are zoom lenses). The second lens 4B can have a fixed lens aperture, or can have an adjustable aperture controlled by a motor or actuator (not shown). The output of the second image sensor 14B is converted to digital form by an Analog Signal Processor (ASP) and Analog-to-Digital (A/D) converter 16B, and the digital data is provided to the multiplexer 17.
In other embodiments, the second image capture system 1B may use some or all of the same components as the first image capture system 1A. For example, the first image sensor 14A can be used for both the first and second image capture systems 1A and 1B, and a pivoting mirror can be used to direct light from the first lens 4A or the second lens 4B onto the first image sensor 14A.
The multiplexer 17 provides either the output of ASP and A/D converter 16A or the output of ASP and A/D converter 16B to a buffer memory 18, which stores the image data from either the first image capture system 1A or the second image capture system 1B. The image data stored in buffer memory 18 is subsequently manipulated by a processor 20, using embedded software programs (e.g., firmware) stored in firmware memory 28. The processor 20 controls the multiplexer 17 in response to user inputs provided using user controls 34 in order to determine whether the first image capture system 1A or the second image capture system 1B is used to capture images.
In some embodiments, the software program is permanently stored in firmware memory 28 using a read only memory (ROM). In other embodiments, the firmware memory 28 can be modified by using, for example, Flash EPROM memory. In such embodiments, an external device can update the software programs stored in firmware memory 28 using a wired interface 38 or a wireless modem 50. In such embodiments, the firmware memory 28 can also be used to store image sensor calibration data, user setting selections and other data which must be preserved when the camera is turned off. In some embodiments, the processor 20 includes a program memory (not shown), and the software programs stored in the firmware memory 28 are copied into the program memory before being executed by the processor 20.
It will be understood that the functions of processor 20 can be provided using a single programmable processor or by using multiple programmable processors, including one or more digital signal processor (DSP) devices. Alternatively, the processor 20 can be provided by custom circuitry (e.g., by one or more custom integrated circuits (ICs) designed specifically for use in digital cameras), or by a combination of programmable processor(s) and custom circuits. It will be understood that connections between the processor 20 and some or all of the various components shown in FIG. 1 can be made using a common data bus. For example, in some embodiments the connection between the processor 20, the buffer memory 18, the image memory 30, and the firmware memory 28 can be made using a common data bus.
The processed images are then stored using the image memory 30. It is understood that the image memory 30 can be any form of memory known to those skilled in the art including, but not limited to, a removable Flash memory card, internal Flash memory chips, magnetic memory, or optical memory. In some embodiments, the image memory 30 can include both internal Flash memory chips and a standard interface to a removable Flash memory card, such as a Secure Digital (SD) card. Alternatively, a different memory card format can be used, such as a micro SD card, Compact Flash (CF) card, MultiMedia Card (MMC), xD card or Memory Stick.
The first image sensor 14A and the second image sensor 14B are controlled by a timing generator 12, which produces various clocking signals to select rows and pixels and synchronizes the operation of the ASP and A/D converters 16A and 16B. The first image sensor 14A can have, for example, 12.4 megapixels (e.g., 4088×3040 pixels) in order to provide a still image file of approximately 4000×3000 pixels. To provide a color image, the image sensor is generally overlaid with a color filter array, which provides an image sensor having an array of pixels that include different colored pixels. The different color pixels can be arranged in many different patterns. As one example, the different color pixels can be arranged using the well-known Bayer color filter array, as described in commonly assigned U.S. Pat. No. 3,971,065, entitled “Color imaging array,” to Bayer, the disclosure of which is incorporated herein by reference. As a second example, the different color pixels can be arranged as described in commonly assigned U.S. Patent Application Publication No. 2007/0024931 to Compton and Hamilton, entitled “Image sensor with improved light sensitivity,” the disclosure of which is incorporated herein by reference. These examples are not limiting, and many other color patterns may be used. The second image sensor 14B can have the same number of pixels as the first image sensor 14A, or can have a different number of pixels.
It will be understood that the first image sensor 14A, the timing generator 12, and ASP and A/D converter 16A can be separately fabricated integrated circuits, or they can be fabricated as a single integrated circuit as is commonly done with CMOS image sensors. In some embodiments, this single integrated circuit can perform some of the other functions shown in FIG. 1, including some of the functions provided by processor 20.
When selected by the multiplexer 17, the first image sensor 14A or the second image sensor 14B is effective when actuated in a first mode by timing generator 12 for providing a motion sequence of lower resolution sensor image data, which is used when capturing video images and also when previewing a still image to be captured, in order to compose the image. This preview mode sensor image data can be provided as HD resolution image data, for example, with 1280×720 pixels, or as VGA resolution image data, for example, with 640×480 pixels, or using other resolutions which have significantly fewer columns and rows of data compared to the resolution of the image sensor.
The preview mode sensor image data can be provided by combining values of adjacent pixels having the same color, or by eliminating some of the pixel values, or by combining some color pixel values while eliminating other color pixel values. The preview mode image data can be processed as described in commonly assigned U.S. Pat. No. 6,292,218 to Parulski et al., entitled “Electronic camera for initiating capture of still images while previewing motion images,” which is incorporated herein by reference.
The first image sensor 14A and the second image sensor 14B are also effective when actuated in a second mode by timing generator 12 for providing high resolution still image data. This final mode sensor image data is provided as high resolution output image data, which for scenes having a high illumination level includes all of the pixels of the image sensor, and can be, for example, a 12 megapixel final image data having 4000×3000 pixels. At lower illumination levels, the final sensor image data can be provided by “binning” some number of like-colored pixels on the image sensor, in order to increase the signal level and thus the “ISO speed” of the sensor.
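The “binning” of like-colored pixels described above can be illustrated with a short NumPy sketch, assuming an RGGB Bayer mosaic; the function name and layout are illustrative, not taken from the patent:

```python
import numpy as np

def bin_bayer_2x2(mosaic):
    """Sum 2x2 neighborhoods of like-colored samples in an RGGB Bayer mosaic.

    Each of the four Bayer subplanes is binned independently, and the
    results are reassembled into a half-resolution mosaic in the same
    Bayer order.
    """
    h, w = mosaic.shape
    assert h % 4 == 0 and w % 4 == 0, "dimensions must be multiples of 4"
    out = np.empty((h // 2, w // 2), dtype=np.int64)
    for dy in range(2):                # iterate over the four Bayer phases
        for dx in range(2):
            plane = mosaic[dy::2, dx::2]               # one color subplane
            out[dy::2, dx::2] = (plane[0::2, 0::2] + plane[0::2, 1::2] +
                                 plane[1::2, 0::2] + plane[1::2, 1::2])
    return out
```

Summing four like-colored samples raises the signal level by a factor of about four, which is the mechanism by which binning increases the effective “ISO speed” while halving the mosaic resolution in each dimension.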
The exposure level is controlled by controlling the exposure periods of the first image sensor 14A and the second image sensor 14B via the timing generator 12, and the gain (i.e., ISO speed) setting of the ASP and A/D converters 16A and 16B. In some embodiments, the processor 20 also controls one or more illumination systems (not shown), such as a flash unit or an LED, which are used to selectively illuminate the scene in the direction of optical axis A or optical axis B, to provide sufficient illumination under low light conditions.
In some embodiments, the first lens 4A and the second lens 4B of the digital camera 10 can be focused in the first mode by using “through-the-lens” autofocus, as described in commonly-assigned U.S. Pat. No. 5,668,597, entitled “Electronic Camera with Rapid Automatic Focus of an Image upon a Progressive Scan Image Sensor” to Parulski et al., which is incorporated herein by reference. This is accomplished by using the zoom and focus motor drivers (not shown) to adjust the focus position of the first lens 4A or the second lens 4B to a number of positions ranging from a near focus position to an infinity focus position, while the processor 20 determines the closest focus position which provides a peak sharpness value for a central portion of the image captured by the corresponding first image sensor 14A or second image sensor 14B. The focus distance can be stored as metadata in the image file, along with other lens and camera settings.
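The autofocus sweep described above can be modeled in a few lines. This is a generic contrast-autofocus sketch, not the specific method of the cited patent; `capture_at` stands in for driving the focus motor and reading a frame from the image sensor, and all names are hypothetical.

```python
import numpy as np

def sharpness(img):
    # Gradient-energy sharpness metric over the central portion of the frame.
    h, w = img.shape
    center = img[h // 4: 3 * h // 4, w // 4: 3 * w // 4].astype(float)
    gy, gx = np.gradient(center)
    return float((gx ** 2 + gy ** 2).sum())

def autofocus(capture_at, focus_positions):
    # Capture a frame at each candidate focus position and return the
    # position whose central region has the highest sharpness.
    best_score, best_pos = max(
        (sharpness(capture_at(pos)), pos) for pos in focus_positions)
    return best_pos
```

In a real camera the candidate positions would range from a near focus position to an infinity focus position, and the winning position would be stored as focus-distance metadata.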
The processor 20 produces menus and low resolution color images that are temporarily stored in display memory 36 and are displayed on image display 32. The image display 32 is typically an active matrix color liquid crystal display (LCD), although other types of displays, such as organic light emitting diode (OLED) displays, can be used. In some embodiments, the image display 32 may be detachable from the main body of the digital camera 10, or can be on a separate unit. A video interface 44 provides a video output signal from the digital camera 10 to a video display 46, such as a flat panel HDTV display. In preview mode, or video mode, the digital image data from buffer memory 18 is manipulated by processor 20 to form a series of motion preview images that are displayed, typically as color images, on the image display 32. In review mode, the images displayed on the image display 32 are produced using the image data from the digital image files stored in image memory 30.
The graphical user interface displayed on the image display 32 includes various user control elements which can be selected by user controls 34. The user controls 34 are used to select the first image capture system 1A or the second image capture system 1B, to select various camera modes, such as video capture mode, still capture mode, and review mode, and to initiate capture of still images and the recording of motion images. The user controls 34 are also used to turn on the camera and initiate the image/video capture process. User controls 34 typically include some combination of buttons, rocker switches, joysticks, or rotary dials. In some embodiments, some of the user controls 34 are provided by using a touch screen overlay on the image display 32 having one or more touch-sensitive user control elements.
An audio codec 22 connected to the processor 20 receives an audio signal from a microphone 24 and provides an audio signal to a speaker 26. These components can be used to record and play back an audio track, along with a video sequence or still image. If the digital camera 10 is a multi-function device such as a combination camera and mobile phone, the microphone 24 and the speaker 26 can also be used for other purposes, such as telephone conversations. In some embodiments, the microphone 24 is capable of recording sounds in air and also in an underwater environment when the digital camera 10 is used to record underwater images. In other embodiments, the digital camera 10 includes both a conventional air microphone as well as an underwater microphone (hydrophone) capable of recording underwater sounds.
In some embodiments, the speaker 26 can be used as part of the user interface, for example to provide various audible signals which indicate that a user control has been depressed, or that a particular mode has been selected. In some embodiments, the microphone 24, the audio codec 22, and the processor 20 can be used to provide voice recognition, so that the user can provide a user input to the processor 20 by using voice commands, rather than user controls 34. The speaker 26 can also be used to inform the user of an incoming phone call. This can be done using a standard ring tone stored in firmware memory 28, or by using a custom ring-tone downloaded from a wireless network 52 and stored in the image memory 30. In addition, a vibration device (not shown) can be used to provide a silent (e.g., non-audible) notification of an incoming phone call.
The processor 20 also provides additional processing of the image data from the first image sensor 14A or the second image sensor 14B, in order to produce rendered sRGB still image data which is compressed and stored within a “finished” image file, such as a well-known Exif-JPEG still image file, in the image memory 30, and also to produce rendered video image data which is compressed and stored within a digital video file, such as the well-known H.264 video image file.
The digital camera 10 can be connected via the wired interface 38 to an interface/recharger 48, which is connected to a computer 40, which can be a desktop computer or portable computer located in a home or office. The wired interface 38 can conform to, for example, the well-known USB 2.0 interface specification. The interface/recharger 48 can provide power via the wired interface 38 to recharge a set of camera batteries 43 which supply power to a camera power manager 42 in the digital camera 10.
The camera power manager 42 provides both a normal image capture mode and a low-power image capture mode. In the normal image capture mode, power is supplied to the image display 32 as images are captured, since the user is typically using the image display 32 to compose the captured images while holding the digital camera 10. In the low-power image capture mode, power is not supplied to the image display 32, in order to conserve battery power by not displaying images on the image display 32. Since the digital camera 10 is typically mounted (e.g., to a bike or another moving device) when the low-power image capture mode is used, the user is not in a position to view the image display 32, so providing images to the image display 32 is wasteful.
The digital camera 10 includes a wireless modem 50, which communicates with a remote control module 200 over a wireless network 52. The wireless modem 50 can use various wireless interface protocols, such as the well-known Bluetooth wireless interface or the well-known 802.11 wireless interface, or various proprietary protocols. In some embodiments, the digital camera 10 can communicate over the wireless network 52 with a wireless modem (not shown) in computer 40, in order to transfer captured digital images to the computer 40. In some embodiments, the digital camera 10 can transfer images (still or video) to a wireless access point 74 in order to communicate via the Internet 70 with a service provider 72, such as Facebook, Flickr, YouTube or the Kodak EasyShare Gallery, to transfer images. Other devices (not shown) can access the images stored by the service provider 72 via the Internet 70, including the computer 40.
In alternative embodiments, the wireless modem 50 communicates over a radio frequency (e.g., wireless) link with a mobile phone network (not shown), such as a 3GSM network, which connects with the Internet 70 in order to upload digital image files from the digital camera 10. These digital image files can be provided to the computer 40 or the service provider 72.
In some embodiments, the digital camera 10 is a waterproof digital camera capable of being used to capture digital images underwater and under other challenging environmental conditions, such as in rain or snow. For example, the digital camera 10 can be used by scuba divers exploring a coral reef or by children playing at a beach. To prevent damage to the various camera components, in these embodiments the digital camera 10 includes a watertight housing (not shown).
FIG. 2 is a flow diagram depicting image processing operations that can be performed by the processor 20 (FIG. 1) in the digital camera 10 (FIG. 1) in order to process color sensor data 100 from the first image sensor 14A output by the ASP and A/D converter 16A or from the second image sensor 14B output by the ASP and A/D converter 16B. In some embodiments, the processing parameters used by the processor 20 to manipulate the color sensor data 100 for a particular digital image are determined by various user settings 175, which are typically associated with photography modes that can be selected via the user controls 34 (FIG. 1), which enable the user to adjust various camera settings 185 in response to menus displayed on the image display 32 (FIG. 1). In a preferred embodiment, the user control elements available in the menus are adjusted responsive to sensed environmental conditions.
The color sensor data 100 which has been digitally converted by the ASP and A/D converter 16A or the ASP and A/D converter 16B is manipulated by a white balance step 95. In some embodiments, this processing can be performed using the methods described in commonly-assigned U.S. Pat. No. 7,542,077 to Miki, entitled “White balance adjustment device and color identification device,” the disclosure of which is herein incorporated by reference. The white balance can be adjusted in response to a white balance setting 90, which can be manually set by a user, or can be automatically set to different values when the camera is used in different environmental conditions.
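As a deliberately simple stand-in for the cited white balance method, the following gray-world sketch scales the red and blue channels so that all channel means match the green mean. It is not the Miki method, only an illustration of what a white balance step does:

```python
import numpy as np

def gray_world_white_balance(rgb):
    """Scale each channel so its mean matches the green channel's mean.

    Gray-world assumption (illustrative only): the scene averages to
    neutral gray under the prevailing illuminant.
    """
    means = rgb.reshape(-1, 3).mean(axis=0)   # per-channel means (R, G, B)
    gains = means[1] / means                  # normalize each channel to green
    return rgb * gains
```

A manual white balance setting would simply replace the estimated gains with preset gains for daylight, tungsten, underwater, and so on.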
The color image data is then manipulated by a noise reduction step 105 in order to reduce noise from the first image sensor 14A or the second image sensor 14B. In some embodiments, this processing can be performed using the methods described in commonly-assigned U.S. Pat. No. 6,934,056 to Gindele et al., entitled “Noise cleaning and interpolating sparsely populated color digital image using a variable noise cleaning kernel,” the disclosure of which is herein incorporated by reference. In some embodiments, the level of noise reduction can be adjusted in response to an ISO setting 110, so that more filtering is performed at higher ISO exposure index settings.
The color image data is then manipulated by a demosaicing step 115 in order to provide red, green and blue (RGB) image data values at each pixel location. Algorithms for performing the demosaicing step 115 are commonly known as color filter array (CFA) interpolation algorithms or “deBayering” algorithms. In some embodiments of the present invention, the demosaicing step 115 can use the luminance CFA interpolation method described in commonly-assigned U.S. Pat. No. 5,652,621, entitled “Adaptive color plane interpolation in single sensor color electronic camera,” to Adams et al., the disclosure of which is incorporated herein by reference. The demosaicing step 115 can also use the chrominance CFA interpolation method described in commonly-assigned U.S. Pat. No. 4,642,678, entitled “Signal processing method and apparatus for producing interpolated chrominance values in a sampled color image signal,” to Cok, the disclosure of which is herein incorporated by reference.
In some embodiments, the user can select between different pixel resolution modes, so that the digital camera can produce a smaller size image file. Multiple pixel resolutions can be provided as described in commonly-assigned U.S. Pat. No. 5,493,335, entitled “Single sensor color camera with user selectable image record size,” to Parulski et al., the disclosure of which is herein incorporated by reference. In some embodiments, a resolution mode setting 120 can be selected by the user to be full size (e.g., 3,000×2,000 pixels), medium size (e.g., 1,500×1,000 pixels) or small size (e.g., 750×500 pixels).
The color image data is color corrected in color correction step 125. In some embodiments, the color correction is provided using a 3×3 linear space color correction matrix, as described in commonly-assigned U.S. Pat. No. 5,189,511, entitled “Method and apparatus for improving the color rendition of hardcopy images from electronic cameras,” to Parulski et al., the disclosure of which is incorporated herein by reference. In some embodiments, different user-selectable color modes can be provided by storing different color matrix coefficients in firmware memory 28 of the digital camera 10. For example, four different color modes can be provided, so that the color mode setting 130 is used to select one of the following color correction matrices:
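As an illustrative sketch (not taken from the patent), applying a user-selected 3×3 color correction matrix to one RGB pixel can be expressed as follows. The matrix coefficients shown here are hypothetical placeholders whose rows sum to 1.0 so that neutral gray is preserved:

```python
# Hypothetical 3x3 color correction matrices selected by the color mode
# setting 130; these coefficient values are illustrative placeholders.
COLOR_MATRICES = {
    "none":   [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]],
    "normal": [[1.5, -0.3, -0.2], [-0.4, 1.8, -0.4], [-0.2, -0.3, 1.5]],
}

def color_correct(rgb, color_mode):
    """Apply the 3x3 matrix for the selected color mode to one RGB pixel."""
    m = COLOR_MATRICES[color_mode]
    return tuple(
        max(0, min(255, round(sum(m[i][j] * rgb[j] for j in range(3)))))
        for i in range(3)
    )
```

Because each row sums to 1.0, a neutral pixel such as (100, 100, 100) passes through unchanged, while colored pixels have their saturation boosted by the off-diagonal terms.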
The color image data is also manipulated by a tone scale correction step 135. In some embodiments, the tone scale correction step 135 can be performed using a one-dimensional look-up table as described in U.S. Pat. No. 5,189,511, cited earlier. In some embodiments, a plurality of tone scale correction look-up tables is stored in the firmware memory 28 in the digital camera 10. These can include look-up tables which provide a “normal” tone scale correction curve, a “high contrast” tone scale correction curve, and a “low contrast” tone scale correction curve. A user selected contrast setting 140 is used by the processor 20 to determine which of the tone scale correction look-up tables to use when performing the tone scale correction step 135.
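A one-dimensional look-up table of this kind can be sketched as follows. The simple gamma-style curves are hypothetical stand-ins for the stored correction curves; a real “high contrast” table would typically use an S-shaped curve rather than a pure power curve:

```python
# Hypothetical contrast curves standing in for the stored look-up tables,
# keyed by the contrast setting 140.
CONTRAST_GAMMA = {"low contrast": 0.8, "normal": 1.0, "high contrast": 1.3}

def build_tone_lut(contrast_setting):
    """Build a 256-entry one-dimensional tone scale look-up table."""
    g = CONTRAST_GAMMA[contrast_setting]
    return [round(255 * (v / 255) ** g) for v in range(256)]

def apply_tone_scale(code_value, lut):
    """Tone scale correction step: a single table look-up per code value."""
    return lut[code_value]
```

Storing the curves as precomputed tables means the per-pixel cost is a single array access, which suits a low-power embedded processor.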
The color image data is also manipulated by an image sharpening step 145. In some embodiments, this can be provided using the methods described in commonly-assigned U.S. Pat. No. 6,192,162, entitled “Edge enhancing colored digital images,” to Hamilton et al., the disclosure of which is incorporated herein by reference. In some embodiments, the user can select between various sharpening settings, including a “normal sharpness” setting, a “high sharpness” setting, and a “low sharpness” setting. In this example, the processor 20 uses one of three different edge boost multiplier values, for example 2.0 for “high sharpness,” 1.0 for “normal sharpness,” and 0.5 for “low sharpness” levels, responsive to a sharpening setting 150 selected by the user of the digital camera 10. In some embodiments, different image sharpening algorithms can be manually or automatically selected, depending on the environmental condition.
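The effect of the edge boost multipliers can be illustrated with a minimal one-dimensional unsharp-mask sketch. This is not the patented method of U.S. Pat. No. 6,192,162; the 3-tap blur kernel is an assumption made purely for illustration:

```python
# Edge boost multipliers from the sharpening setting 150 described above.
EDGE_BOOST = {"high sharpness": 2.0, "normal sharpness": 1.0, "low sharpness": 0.5}

def sharpen_row(row, sharpening_setting):
    """1-D unsharp-mask sketch: boost each pixel's difference from a 3-tap blur."""
    k = EDGE_BOOST[sharpening_setting]
    out = list(row)
    for i in range(1, len(row) - 1):
        blurred = (row[i - 1] + 2 * row[i] + row[i + 1]) / 4.0
        out[i] = max(0, min(255, round(row[i] + k * (row[i] - blurred))))
    return out
```

Flat regions are unchanged (the pixel equals its local blur), while edges are boosted in proportion to the selected multiplier.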
The color image data is also manipulated by an image compression step 155. In some embodiments, the image compression step 155 can be provided using the methods described in commonly-assigned U.S. Pat. No. 4,774,574, entitled “Adaptive block transform image coding method and apparatus,” to Daly et al., the disclosure of which is incorporated herein by reference. In some embodiments, the user can select between various compression settings. This can be implemented by storing a plurality of quantization tables, for example, three different tables, in the firmware memory 28 of the digital camera 10. These tables provide different quality levels and average file sizes for the compressed digital image file 180 to be stored in the image memory 30 of the digital camera 10. A user selected compression mode setting 160 is used by the processor 20 to select the particular quantization table to be used for the image compression step 155 for a particular image.
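One common way to obtain a family of such tables is to scale a single base quantization table, as sketched below. The base entries, scale factors, and setting names are hypothetical, not values taken from the patent:

```python
# First eight entries of a hypothetical base luminance quantization table.
BASE_QUANT = [16, 11, 10, 16, 24, 40, 51, 61]

# Larger quantization steps discard more detail, giving smaller files.
QUALITY_SCALE = {"best": 0.5, "better": 1.0, "good": 2.0}

def quantization_table(compression_setting):
    """Derive the quantization table for the compression mode setting 160."""
    s = QUALITY_SCALE[compression_setting]
    return [max(1, min(255, round(q * s))) for q in BASE_QUANT]
```

The coarser table selected by the “good” setting yields a smaller average file size at the cost of quality, matching the trade-off the paragraph describes.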
The compressed color image data is stored in a digital image file 180 using a file formatting step 165. The image file can include various metadata 170. Metadata 170 is any type of information that relates to the digital image, such as the model of the camera that captured the image, the size of the image, the date and time the image was captured, and various camera settings, such as the lens focal length, the exposure time and F/# of the lens, and whether or not the camera flash fired. In some embodiments, the metadata 170 can also include one or more environmental readings 190 provided by appropriate environmental sensors associated with the digital camera 10. For example, an underwater sensor (not shown) can be used to provide an environmental reading indicating whether the digital camera 10 is being operated underwater. Similarly, a Global Positioning System (GPS) sensor (not shown) can be used to provide an environmental reading indicating a geographical location, or an inertial motion sensor such as a gyroscope or an accelerometer can be used to provide an environmental reading indicating a camera motion or orientation. In a preferred embodiment, all of this metadata 170 is stored using standardized tags within the well-known Exif-JPEG still image file or within the H.264 video image file.
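The assembly of metadata 170, including optional environmental readings 190, can be sketched as follows. The helper function and its field names are hypothetical and merely illustrate the grouping of camera settings and sensor readings described above:

```python
def build_metadata(model, capture_time, camera_settings, environmental_readings):
    """Gather metadata 170 for the digital image file 180 (hypothetical field names)."""
    metadata = {
        "Model": model,
        "DateTime": capture_time,  # e.g. "2012:06:01 10:30:00" in Exif date format
        "FocalLength": camera_settings.get("focal_length"),
        "ExposureTime": camera_settings.get("exposure_time"),
        "FNumber": camera_settings.get("f_number"),
        "Flash": camera_settings.get("flash_fired", False),
    }
    # Environmental readings 190, e.g. {"Underwater": True, "GPSLatitude": 47.6}
    metadata.update(environmental_readings)
    return metadata
```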
FIGS. 3A-3C are drawings which depict the camera body 400 of the digital camera 10. FIG. 3A is a drawing depicting a rear view of the camera body 400, FIG. 3B is a drawing depicting a front and top view of the camera body 400, and FIG. 3C is a drawing depicting a rear and bottom view of the camera body 400.
The camera body 400 of the digital camera 10 includes a first surface 410 having an image display 32, as shown in FIG. 3A. The image display 32 is used for displaying captured digital images, as described earlier in reference to FIG. 1.
The camera body 400 of the digital camera 10 also includes a second surface 420, opposite to the first surface 410, as shown in FIG. 3B. The first image capture system 1A (FIG. 1), which includes the first image sensor 14A (FIG. 1) and the first lens 4A that forms an image of a scene onto the first image sensor 14A (FIG. 1), has an optical axis A directed outward from the second surface 420.
The camera body 400 of the digital camera 10 also includes a third surface 430 transverse to the first surface 410 and the second surface 420. The third surface 430 has a smaller surface area than the first surface 410 (and likewise the second surface 420). Generally, the surface area of the third surface should be less than 40% of the surface area of the first surface 410. Preferably, the surface area of the third surface is between 5% and 20% of the surface area of the first surface 410. The second image capture system 1B (FIG. 1), which includes the second image sensor 14B (FIG. 1) and the second lens 4B that forms an image of a scene onto the second image sensor 14B (FIG. 1), has an optical axis B directed outward from the third surface 430.
The camera body 400 of the digital camera 10 also includes a fourth surface 440 opposite to the third surface 430. A first camera mount 415 is positioned on the fourth surface 440 to facilitate the camera body 400 being mounted to a support (as will be described later in reference to FIGS. 4 and 5) such that the first optical axis A is oriented in a substantially horizontal direction. In other embodiments, the first camera mount 415 can alternatively be positioned on the third surface 430.
A second camera mount 425 is positioned on the second surface 420 to facilitate the camera body 400 being mounted to a support such that the second optical axis B is oriented in a substantially horizontal direction. In other embodiments, the second camera mount 425 can alternatively be positioned on the first surface 410.
The smaller surface area of the third surface 430 provides a lower profile when the camera body 400 is positioned such that the optical axis B is oriented in a substantially horizontal direction. This is advantageous for applications where the digital camera 10 is used while the user is in motion, such as when it is mounted to a user's helmet while they are skiing, or when it is mounted to a bike handlebar. The lower profile provides reduced wind resistance and a reduced risk of damage (e.g., due to interference with overhanging branches). The reduced wind resistance has the additional advantage that it provides reduced wind noise in the audio tracks of captured videos. Preferably, the camera body 400 has a streamlined profile with rounded edges to further reduce wind resistance. The camera body 400 also has a lower center of gravity in this orientation, which is advantageous for reducing vibrations.
In some embodiments, the first lens 4A and the second lens 4B have different focal lengths for capturing different fields-of-view of the scene. Likewise, the first image sensor 14A and the second image sensor 14B can have different resolutions (i.e., different numbers of light-sensitive image pixels) and quality levels. For example, the first image capture system 1A with the first lens 4A and first image sensor 14A will be more likely to be used in a hand-held still photography mode, where a high-resolution, high-quality image sensor is of great importance. Similarly, the second image capture system 1B with the second lens 4B and second image sensor 14B will be more likely to be used in an action video capture mode, where a wide-angle lens having a wider field of view is generally desirable and where a high resolution/quality image sensor is not as critical. The wider field-of-view has the advantage that it captures a larger portion of the scene, which is generally preferred during action shots, and is also less sensitive to image stability problems. The use of a lower resolution/quality sensor has the advantage that it will typically have a lower cost, and can also have a smaller physical size (which is desirable for mechanical design considerations), while still providing adequate image quality for capturing a good-quality HD video.
In some embodiments, the first camera mount 415 and the second camera mount 425 are tripod mounting screws conforming to the well-known international standard ISO 1222:2010, Photography-Tripod connections, which is available from the International Organization for Standardization, Geneva, Switzerland. In other embodiments, the first camera mount 415 or the second camera mount 425 can use other types of mounting interfaces, including proprietary custom interfaces using connection means such as screws, pins, clips, latches or magnets.
The camera body 400 of the digital camera 10 provides a camera user interface including an image path control 401 for selecting between the first image capture system 1A and the second image capture system 1B. In some embodiments, the image path control 401 can also be used to select an image capture mode where both the first image capture system 1A and the second image capture system 1B are simultaneously used to capture images. A capture operation control 402 is also provided for initiating an image capture operation using the selected first image capture system 1A or second image capture system 1B, as is a power control 403 that enables the user to turn the digital camera 10 off and on. In some embodiments, the image path control 401 enables the user to select a low power mode, and in other embodiments, the power control 403 enables the user to select a low power mode, as will be described later in reference to FIG. 6.
In some embodiments, when the image path control 401 is used to select between the first image capture system 1A and the second image capture system 1B, various camera settings can be adjusted accordingly. For example, a different default image capture mode can be automatically selected in each case. In some embodiments, when the user selects a particular image capture system, the camera settings are set to the values that the user had selected the last time that the digital camera 10 had been set to use that image capture system. This enables the user to define different default settings for the first image capture system 1A and the second image capture system 1B without needing to manually reset them each time that the image capture system is changed.
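This per-system recall of settings can be sketched as a small store keyed by image capture system. The class and its method names are hypothetical, chosen only to illustrate the behavior described above:

```python
class PerSystemSettings:
    """Remember the last-used camera settings separately for each image capture system."""

    def __init__(self, defaults):
        # One independent settings dictionary per image capture system (1A, 1B).
        self.settings = {"1A": dict(defaults), "1B": dict(defaults)}
        self.active = "1A"

    def select_system(self, system):
        """Switch systems and restore the settings last used with that system."""
        self.active = system
        return self.settings[system]

    def update(self, **changes):
        """Record user adjustments against the currently selected system only."""
        self.settings[self.active].update(changes)
```

Because each system keeps its own dictionary, adjusting settings for one system leaves the other system's defaults untouched.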
The camera body 400 of the digital camera 10 includes a memory card access door 444 for accessing a removable memory card 442. The removable memory card 442 provides the image memory 30 (shown in FIG. 1), which is used as a storage memory for storing digital images captured using the selected first image capture system 1A or the second image capture system 1B. The camera body 400 of the digital camera 10 also includes a connector access door 446 that can be used to access various connectors, such as a power cable connector or a USB cable connector.
FIG. 4A is a drawing depicting the camera body 400 of the digital camera 10 mounted using a helmet mounting clip 460, which is attached to the second camera mount 425 (FIG. 3B) on the second surface 420 (FIG. 3B) of the camera body 400 using a quick release tab 450.
FIG. 4B is a drawing depicting the helmet mounting clip 460. The helmet mounting clip 460 can be attached to a protective helmet (not shown), such as a bike helmet, motorcycle helmet, skateboard helmet, skydiving helmet, or ski helmet, using Velcro, double-sided tape, or a strap (not shown). The helmet mounting clip 460 includes a slot 462 into which the quick release tab 450 can slide. While the helmet mounting clip 460 is nominally adapted for mounting the digital camera 10 to a helmet, it should be noted that the helmet mounting clip 460 can be attached to many other types of objects as well, such as a surfboard or a car bumper.
FIG. 4C is a drawing depicting the quick release tab 450. A screw 452 is used to secure the quick release tab 450 to the second camera mount 425 on the second surface 420 (or the first camera mount 415 on the fourth surface 440) of the camera body 400. The edge portion 454 of the quick release tab 450 has a reduced thickness, relative to the thickness of a central portion 456 of the quick release tab 450, to enable the quick release tab 450 to be inserted in the slot 462 of the helmet mounting clip 460, or into a bar mount, which will be described later relative to FIGS. 5A-5B.
FIG. 5A is a drawing depicting a bar mount 470 used to attach the camera body 400 (FIG. 3A) of the digital camera 10 to a bar 474. The bar 474 can be, for example, the handlebar of a bike or a motorcycle, or can be a ski pole, roof rack pole, or the mast of a sailboat or windsurfer. In some embodiments, the bar mount 470 is attached to the bar 474 using straps 476. In other embodiments, the bar mount can be attached using some other mounting mechanism, such as cable ties or bolts.
FIG. 5B is an exploded view depicting the components of the bar mount 470. The bar mount 470 includes a mount rail 480, which includes a slot 482 into which the quick release tab 450 (FIG. 4C) can slide. The bar mount 470 also includes a mount base 490. In a preferred embodiment, the lower surface of the mount base includes elastomer strips (not shown) for gripping the bar 474 (FIG. 5A). The bar mount 470 is secured to the bar 474 using straps 476 (FIG. 5A) or some other mounting mechanism.
The mount rail 480 is attached to the mount base 490 using a screw 495, a washer 494, and a spring 493. The spring 493 enables the mount rail 480 to be lifted and then rotated relative to the mount base 490 in the direction generally shown by arrow 484. This enables the mount rail 480 to be positioned above the mount base 490 in one of 16 detent positions, corresponding to the positions of the 16 holes 492.
FIG. 6 is a flowchart showing steps for controlling the digital camera 10 (FIGS. 3A-3C) according to a normal image capture mode and a low-power image capture mode. In set capture mode step 500, the digital camera 10 is set to operate in either the normal image capture mode or the low-power image capture mode.
In some embodiments, the image capture mode is set in response to user activation of the image path control 401 (FIG. 3B), which also selects the first image capture system 1A or the second image capture system 1B (FIG. 1). In such embodiments, when the first image capture system 1A is selected, the normal image capture mode is preferably used, and when the second image capture system 1B is selected, the low power image capture mode is preferably used. The processor 20 (FIG. 1) in the digital camera 10 responds to the user activation of the image path control 401 to select the first image capture system 1A by setting the mode of the camera power manager 42 (FIG. 1) to the normal image capture mode and setting the multiplexer 17 to output the digital image data from ASP and A/D converter 16A. The processor 20 responds to the user activation of the image path control 401 to select the second image capture system 1B by setting the mode of the camera power manager 42 to the low power image capture mode and setting the multiplexer 17 to output the digital image data from ASP and A/D converter 16B.
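The processor's paired response to the image path control can be sketched as follows. The function and the returned field names are hypothetical, not an interface defined by the patent:

```python
def respond_to_image_path_control(selected_system):
    """Set the camera power manager mode and multiplexer 17 source for the
    selected image capture system (hypothetical representation)."""
    if selected_system == "1A":   # hand-held still photography system
        return {"power_mode": "normal", "mux_source": "ASP/AD 16A"}
    if selected_system == "1B":   # low-profile action video system
        return {"power_mode": "low", "mux_source": "ASP/AD 16B"}
    raise ValueError("unknown image capture system: %r" % selected_system)
```

A single control thus drives two settings at once: which converter feeds the multiplexer, and which power mode the camera power manager uses.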
In some other embodiments, the power control 403 (FIG. 3B) is used to select the low power image capture mode, rather than the image path control 401. For example, the power control 403 is first used to turn on the digital camera 10. The user then uses the image path control 401 to select either the first image capture system 1A or the second image capture system 1B. The user can then mount the digital camera 10 to their bike helmet, before placing the bike helmet on their head, as described earlier in reference to FIG. 4A. The user can then press and release the power control 403 in order to place the digital camera 10 in the low power mode. Finally, the user can place the helmet on their head and use the remote control module 200 (FIG. 1) to initiate image capture operations.
In initiate capture operation step 505, the processor 20 (FIG. 1) initiates an image capture operation in response to user activation of an appropriate user control. In some embodiments, the user control is the capture operation control 402 (FIG. 3A). In other embodiments, the user control is included in the remote control module 200 (FIG. 1), which will be described later in reference to FIGS. 7A-7B. The processor 20 initiates the image capture operation by beginning the capture of a digital video (or a burst image sequence or a time-lapse image sequence), or by capturing a digital still image, as described earlier in reference to FIGS. 1 and 2.
In low-power mode test 510, the processor 20 determines whether the camera power manager 42 (FIG. 1) has been set to the low-power image capture mode. If the low-power mode test 510 determines that the digital camera 10 is not in the low-power image capture mode (i.e., it is in the normal image capture mode), a display captured images step 515 is used to display the captured digital images on the image display 32 (FIG. 1). This is appropriate, for example, when the user is hand-holding the digital camera 10 while capturing a video clip.
In record captured images step 525, the captured digital video images or digital still images are recorded in the image memory 30 (FIG. 1). The image memory 30 can be the removable memory card 442 described earlier in reference to FIG. 3B.
If the low-power mode test 510 determines that the digital camera 10 is in the low-power image capture mode, the captured images are not displayed on the image display 32, in order to reduce power consumption, and the process proceeds to the record captured images step 525. This is appropriate, for example, when the digital camera 10 is mounted to a user's bike helmet while capturing a still image or a video clip, since, in this case, the user is unable to view the image display 32.
In some embodiments, if the user activates an appropriate user control to switch between the low-power image capture mode and the normal image capture mode while a digital video image is being captured, the camera power manager 42 switches the image capture mode between the low-power image capture mode and the normal image capture mode without interrupting the video capture process. For example, a user may mount the digital camera 10 in an appropriate position (for example, on a tripod or a bicycle handlebar) and initiate a video capture process while the digital camera 10 is operating in the normal image capture mode. However, once the video capture process is initiated, the user may desire to switch to the low-power image capture mode to conserve battery power after confirming that the image is properly framed. In response to activation of the appropriate user control, the camera power manager 42 will switch to the low-power image capture mode without interrupting the video capture process.
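The FIG. 6 control flow, including this mid-capture mode switch, can be sketched as a small state object. The class and method names are hypothetical:

```python
class CaptureController:
    """Sketch of the FIG. 6 flow: display frames only in the normal image
    capture mode, and allow the mode to change without stopping recording."""

    def __init__(self, mode="normal"):
        self.mode = mode             # "normal" or "low-power"
        self.recording = False
        self.displayed = []
        self.recorded = []

    def initiate_capture(self):
        self.recording = True

    def set_mode(self, mode):
        # Mode switches take effect immediately, without interrupting capture.
        self.mode = mode

    def process_frame(self, frame):
        if not self.recording:
            return
        if self.mode == "normal":    # display captured images step 515
            self.displayed.append(frame)
        self.recorded.append(frame)  # record captured images step 525
```

Note that the mode only gates the display path; every frame reaches the recording path regardless of mode, which is what lets the user switch to low power without breaking the video.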
In some embodiments, a live preview image is displayed on the image display 32 before an image capture operation is initiated when the digital camera is set to operate in the normal image capture mode, but no live preview image is displayed when the digital camera is set to operate in the low-power image capture mode.
In some embodiments, the digital camera 10 automatically enters the low-power image capture mode after a predefined period of inactivity (e.g., a period during which the user has not activated any camera features and the camera is not recording), or when the power level of the camera batteries 43 (FIG. 1) falls below a predefined threshold.
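This automatic transition can be expressed as a simple predicate. The default threshold values below are hypothetical examples, not values specified by the patent:

```python
def should_enter_low_power(idle_seconds, battery_fraction, recording,
                           idle_limit=60.0, battery_threshold=0.15):
    """Return True when the camera should automatically enter the
    low-power image capture mode (hypothetical threshold values)."""
    if recording:
        # Recording counts as activity, so the inactivity rule does not apply;
        # a low battery still triggers the low-power mode.
        return battery_fraction < battery_threshold
    return idle_seconds >= idle_limit or battery_fraction < battery_threshold
```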
It will be understood that when the digital camera 10 is set to operate in the normal image capture mode, captured digital images are displayed on the image display 32 as they are captured, and that when the digital camera 10 is set to operate in the low-power image capture mode, captured digital images are not displayed on the image display 32 as they are captured. It will be further understood that recorded digital images that were captured in either the normal image capture mode or the low-power image capture mode can be viewed on the image display 32 (FIG. 1) at a later time, when the digital camera 10 is set to a review mode.
FIG. 7A is a high-level diagram showing the components of the remote control module 200 of FIG. 1. FIG. 7B is a drawing of a front view of the remote control module 200 shown in FIG. 7A according to one embodiment. The remote control module 200 can include a wrist strap 280 which secures the remote control module 200 to a wrist of the user, or to some other object such as a bicycle handlebar. In this way, the remote control module 200 can be accessible as the user engages in an activity such as mountain biking or surfing. In some embodiments, the remote control module 200 can include a mounting interface that enables it to be mounted to various objects or surfaces. For example, the remote control module 200 can include a tripod mount (similar to the first camera mount 415 shown in FIG. 3C) or include a tab that is adapted to be connected to the slot 482 in the bar mount 470 of FIG. 5B.
The remote control module 200 includes a processor 220 which controls the functions of the remote control module 200 using instructions stored in firmware memory 228. In some embodiments, the processor 220 is a microprocessor which also includes a read only memory (ROM) or a programmable read only memory (PROM) that stores firmware instructions executed by the processor 220. In some embodiments, a firmware memory 228 can be used to store the firmware instructions. It will be understood that in some embodiments, the processor 220 can be provided by custom circuitry (e.g., by one or more custom integrated circuits (ICs) designed specifically for use in wireless remote controls), or by a combination of programmable processors and custom circuits. It will also be understood that connections between the processor 220 and some or all of the various components shown in FIG. 7A can be made using a common data bus (not shown).
The processor 220 interfaces with a remote control power manager 248, which controls the power provided by remote batteries 240, as will be described later in reference to FIG. 8. The processor 220 also interfaces with a wireless modem 250, which communicates with the digital camera 10 (FIG. 1) over the wireless network 52. As described earlier with reference to the wireless modem 50 (FIG. 1) in the digital camera 10, the wireless modem 250 in the remote control module 200 can use various wireless interface protocols, such as the well-known Bluetooth wireless interface or the well-known 802.11 wireless interface, or various proprietary protocols.
The processor 220 receives inputs from user controls 234 and controls a status display 232. The user controls 234 can include a status button 270 for requesting status information for the digital camera 10, a record button 272 for initiating an image capture operation (e.g., a video record operation or a still image capture operation), and a bookmark button 274 for marking important portions of a captured video, as shown in FIG. 7B. It will be understood that in other embodiments, other types of user controls can be employed, such as those described earlier in reference to user controls 34 in FIG. 1. For example, a user control can be provided to enable the user to select between the first image capture system 1A and the second image capture system 1B. User controls 234 on the remote control module 200, such as the record button 272, that are used to send a command to the digital camera 10 can be referred to as command user controls.
The status display 232 can be a liquid crystal display (LCD), a group of light emitting diodes (LEDs), or any other display technology known in the art. The status display 232 includes status display elements for displaying status information pertaining to the digital camera 10 (FIG. 1). For example, the status display 232 shown in FIG. 7B includes a battery level display element 260 for displaying a charge level of the camera batteries 43 (FIG. 1) in the digital camera 10, a signal strength display element 262 for displaying a level of the signal received by the wireless modem 250, a memory fullness display element 264 for displaying an indication of the fullness of the image memory 30 (FIG. 1) in the digital camera 10, and a time display element 266 for displaying time information. In some embodiments, the time information can be the time obtained from a real-time clock (not shown) in the digital camera 10. In some embodiments, when the digital camera 10 is in the process of capturing a digital video, the time information can be the elapsed time since a video recording operation (or a time-lapse photography operation) was initiated. It will be understood that in other embodiments, other types of display elements can be used to display other information that would be of interest to the user, for example the settings of various camera modes and parameters, as described earlier in reference to FIG. 2. In some embodiments, the status display 232 can display a record status display element providing an indication of whether the digital camera 10 is currently recording a digital video (or a time-lapse digital image sequence). Alternately, the record status can be indicated by other means, such as by providing a separate signal light, or by activating a back light for the record button 272.
In some embodiments, a single remote control module 200 can be used to control a plurality of different digital cameras 10. In this case, the remote control module 200 can include user controls that enable the user to specify which of the plurality of digital cameras 10 should be controlled at a particular time.
FIG. 8 is a flowchart showing steps for managing the power in a digital camera system including the digital camera 10 and the remote control module 200. In set low-power state step 550, the processor 220 in the remote control module 200 controls the remote control power manager 248 in order to set the remote control module 200 to operate in a low-power state after a period of inactivity. In some embodiments, the period of inactivity is a fixed predetermined period, such as 60 seconds. In other embodiments, the period of inactivity is a function of the power level of the remote batteries 240. In still other embodiments, the period of inactivity is a user-adjustable predetermined period. For example, the predetermined period can be an inactivity time value selected from a plurality of values (e.g., 10 seconds, 60 seconds, 5 minutes and 1 hour) using one of the user controls 234 on the remote control module 200. In some embodiments, the time value can be selected using the user controls 34 (FIG. 1) on the digital camera 10, which then communicates the value to the remote control module 200 over the wireless network 52. The status display 232 and the wireless modem 250 are powered down in the low-power state.
In user control activated test 555, the processor 220 in the remote control module 200 determines whether one of the user controls 234 has been activated by the user. If the user control activated test 555 determines that none of the user controls 234 have been activated by the user, a maintain low-power state step 560 maintains the low-power state described earlier in reference to the set low-power state step 550.
If the user control activated test 555 determines that one of the user controls 234 has been activated by the user, a set normal-power state step 565 is used to control the remote control power manager 248 in order to set the remote control module 200 to operate in a normal-power state. In the normal-power state, power is supplied to the status display 232 and the wireless modem 250.
In send status inquiry step 570, the processor 220 in the remote control module 200 sends a status inquiry to the digital camera 10 over the wireless network 52 using the wireless modem 250. In response, the digital camera 10 sends status information back to the remote control module 200 over the wireless network 52 using the wireless modem 50 in the digital camera 10.
In display status information step 575, the received status information is displayed on the status display 232 of the remote control module 200. The status information is displayed using the status display elements described earlier in reference to FIG. 7B (i.e., the battery level display element 260, the signal strength display element 262, the memory fullness display element 264 and the time of day display element 266).
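The status inquiry exchange of step 570 and the response fields shown in step 575 can be sketched as a simple request/response pair. The message format and function names below are hypothetical illustrations, not part of the described system:

```python
import json

def build_status_inquiry():
    # Step 570: the remote control module 200 asks the digital camera 10
    # for its current status over the wireless network 52.
    return json.dumps({"type": "status_inquiry"})

def build_status_response(battery_pct, signal_db, memory_pct, time_of_day):
    # The camera replies with the fields rendered on the status display 232:
    # battery level 260, signal strength 262, memory fullness 264 and
    # time of day 266.
    return json.dumps({
        "type": "status_response",
        "battery_level": battery_pct,
        "signal_strength": signal_db,
        "memory_fullness": memory_pct,
        "time_of_day": time_of_day,
    })

reply = json.loads(build_status_response(80, -52, 35, "14:05"))
```

A JSON encoding is used here only for readability; an embedded implementation would more likely use a compact binary packet.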
Following display status information step 575, a user control activated test 580 waits to see whether the user activates one of the user controls 234 during the predefined time interval. If so, a perform operation step 585 performs the operation requested by the user (for example, initiating an image capture operation). The display status information step 575 is then called to update the information displayed on the status display 232 accordingly. If the user control activated test 580 does not detect the activation of any user controls 234 during the predefined time interval, the set low-power state step 550 is repeated to return the remote control module 200 to the low-power mode.
In some embodiments, at least some of the status display elements on the remote control module 200 are powered down after a second, shorter predefined time interval. This enables the remote control module 200 to conserve additional power while it remains in the normal-power mode. In this case, certain status display elements may remain powered up as appropriate. For example, a record status display element may remain powered up during the time that a digital video is being captured even if the user has not interacted with the user controls.
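The state transitions of steps 550 through 585 can be modeled as a small function that maps the history of user-control activations to a power state. This is a minimal sketch for illustration, assuming a timestamp-based inactivity timeout; the function name and interface are hypothetical:

```python
def power_state_after(activation_times, now, timeout=60.0):
    """Illustrative model of the FIG. 8 state machine: the remote control
    module 200 is in the normal-power state (step 565) while any user
    control activation (tests 555/580) occurred within the inactivity
    timeout (step 550); otherwise it is in the low-power state (step 560).

    activation_times -- timestamps (seconds) of user-control activations
    now              -- current time (seconds)
    timeout          -- inactivity period, e.g. the 60-second default
    """
    last = max((t for t in activation_times if t <= now), default=None)
    if last is None or now - last >= timeout:
        return "low_power"      # status display 232 and modem 250 powered down
    return "normal_power"       # display 232 and modem 250 powered (step 565)
```

With the 60-second default, a button press at t = 0 keeps the module in the normal-power state until t = 60, after which step 550 returns it to the low-power state.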
In some embodiments, the digital camera 10 can transmit captured digital images (either digital still images or digital videos) to the remote control module 200 over the wireless connection for display on the status display 232. For example, during the time that the digital camera 10 is capturing a digital video, a temporal sequence of video frames can be transmitted to the remote control module 200 so that the user can monitor the capture process. In some cases, the digital camera 10 may down-sample the video frames spatially or temporally before transmitting them to the remote control module 200 in order to minimize the amount of bandwidth required to transmit the video frames. Similarly, if the digital camera 10 is operating in a still capture mode, a sequence of preview images can be transmitted to the remote control module 200 to allow the user to determine an appropriate time for initiating an image capture operation.
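The spatial and temporal down-sampling described above can be sketched as follows, treating each frame as a 2-D array of pixel values. This is an illustrative example only; the step sizes and function name are assumptions, not part of the described system:

```python
def downsample_frames(frames, temporal_step=2, spatial_step=2):
    """Keep every temporal_step-th frame, and within each kept frame keep
    every spatial_step-th row and column, reducing the bandwidth needed
    to send monitoring video from the camera 10 to the remote control
    module 200."""
    return [
        [row[::spatial_step] for row in frame[::spatial_step]]
        for frame in frames[::temporal_step]
    ]

# Three 2x2 frames reduced to two 1x1 frames:
frames = [[[1, 2], [3, 4]], [[5, 6], [7, 8]], [[9, 10], [11, 12]]]
preview = downsample_frames(frames)  # -> [[[1]], [[9]]]
```

A production implementation would more likely average neighboring pixels (or re-encode at a lower bit rate) rather than simply dropping rows and columns, but the bandwidth trade-off is the same.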
The invention has been described in detail with particular reference to certain preferred embodiments thereof, but it will be understood that variations and modifications can be effected within the spirit and scope of the invention.
PARTS LIST
- 1A image capture system
- 1B image capture system
- 4A lens
- 4B lens
- 10 digital camera
- 12 timing generator
- 14A image sensor
- 14B image sensor
- 16A ASP and A/D Converter
- 16B ASP and A/D Converter
- 17 multiplexer
- 18 buffer memory
- 20 processor
- 22 audio codec
- 24 microphone
- 26 speaker
- 28 firmware memory
- 30 image memory
- 32 image display
- 34 user controls
- 36 display memory
- 38 wired interface
- 40 computer
- 42 power manager
- 43 camera batteries
- 44 video interface
- 46 video display
- 48 interface/recharger
- 50 wireless modem
- 52 wireless network
- 70 Internet
- 72 service provider
- 74 wireless access point
- 90 white balance setting
- 95 white balance step
- 100 color sensor data
- 105 noise reduction step
- 110 ISO setting
- 115 demosaicing step
- 120 resolution mode setting
- 125 color correction step
- 130 color mode setting
- 135 tone scale correction step
- 140 contrast setting
- 145 image sharpening step
- 150 sharpening setting
- 155 image compression step
- 160 compression mode setting
- 165 file formatting step
- 170 metadata
- 175 user settings
- 180 digital image file
- 185 camera settings
- 190 environmental readings
- 200 remote control module
- 220 processor
- 228 firmware memory
- 232 status display
- 234 user controls
- 240 remote batteries
- 248 remote control power manager
- 250 wireless modem
- 260 battery level display element
- 262 signal strength display element
- 264 memory fullness display element
- 266 time of day display element
- 270 status button
- 272 record button
- 274 bookmark button
- 280 wrist strap
- 400 camera body
- 401 image path control
- 402 capture operation control
- 403 power control
- 410 first surface
- 415 first camera mount
- 420 second surface
- 425 second camera mount
- 430 third surface
- 440 fourth surface
- 442 removable memory card
- 444 memory card access door
- 446 connector access door
- 450 quick release tab
- 452 screw
- 454 edge portion
- 456 central portion
- 460 helmet mounting clip
- 462 slot
- 470 bar mount
- 474 bar
- 476 straps
- 480 mount rail
- 482 slot
- 484 arrow
- 490 mount base
- 492 holes
- 493 spring
- 494 washer
- 495 screw
- 500 set capture mode step
- 505 initiate capture operation step
- 510 low-power mode test
- 515 display captured images step
- 525 record captured images step
- 550 set low-power state step
- 555 user control activated test
- 560 maintain low-power state step
- 565 set normal-power state step
- 570 send status inquiry step
- 575 display status information step
- 580 another user control activated test
- 585 perform operation step
- A optical axis
- B optical axis