RELATED APPLICATION
This application claims the priority benefit of U.S. Provisional Patent Application No. 61/974,893, filed on Apr. 3, 2014, and entitled “Auxiliary Photography Systems for Mobile Devices,” the entire contents of which are hereby incorporated by reference herein and made part of this specification for all that they disclose.
BACKGROUND OF THE INVENTIONS
1. Field of the Inventions
This invention relates generally to cameras and photography, and specifically to cameras and photography accessories and applications for mobile devices (e.g., mobile telephones, mobile texting devices, electronic tablet devices, laptop computers, desktop computers, gaming devices, and/or devices capable of linking electronically to another device or to a network such as the Internet, etc.).
2. Description of the Related Art
In recent years, many advances in computer networking and processing technology have made it possible for mobile devices to include cameras that permit users to capture images and videos. In many cases, these images and videos can be stored, processed, manipulated, and transmitted. However, there are many design constraints on onboard cameras in mobile devices that can limit the weight, size, expense, shape, adjustability, and features of such camera systems. Consequently, many cameras and related components in mobile devices are inadequate for certain photographic needs or may not otherwise provide a wide array of features.
SUMMARY OF DISCLOSURE
Some aspects of this disclosure relate to camera systems that can be used with mobile devices to capture and/or record pictures and/or video. In some embodiments, one or more camera systems may be removably attachable to one or more mobile devices, or the one or more camera systems may be independent from and/or non-attachable with one or more mobile electronic devices, and configured to communicate with one or more mobile electronic devices. One or more auxiliary camera systems may be used with a mobile device, such as a mobile electronic device that includes its own onboard camera system. The one or more auxiliary camera systems may include electronic sensors for capturing light, and internal electronics for processing, storing, and/or transmitting images. For example, an auxiliary camera system may be activated by a mobile device to capture and/or record an image, and may transmit the image to the mobile device.
The use of a camera system that is separate from the mobile device can allow the camera to be positioned in a different location than the mobile device, allow multiple cameras to be operated by a single mobile device, provide improved or additional photographic capabilities in comparison with those provided by an onboard camera of the mobile device, and/or provide photographic capabilities for mobile devices that do not have onboard cameras, etc. For example, a separate, dedicated camera system may include a larger and/or higher quality photographic sensor than the photographic sensor of a mobile device's onboard camera. In some embodiments, a single mobile device may use its onboard camera in conjunction with, or generally simultaneously with, one or more removably attachable auxiliary cameras to record different types of images, such as multiple images from different angles, three-dimensional images, images with higher resolution than in an onboard camera in a mobile electronic device, and/or images with different levels of light filtering, magnification, polarization, light sensitivity (e.g., in the visible and/or infrared ranges), and/or aspect ratio, etc. In some embodiments, a single mobile device may control multiple separate camera systems to capture and/or record images from different angles with respect to the same subject or scene.
Some aspects of the disclosure relate to techniques for remotely activating one or more cameras, lighting, flashes, and/or other features of remote devices. In some embodiments, a camera, lighting, flash, and/or other system or device may be physically separate from a mobile electronic device (e.g., not physically connected and/or not able to communicate via a wired connection). The mobile device may activate the camera, lighting, or flash (or some other feature) by using a wireless communication connection (e.g., Bluetooth® or WiFi). In some embodiments, the mobile device may use an onboard flash or lighting component to use light to communicate with (e.g., to activate and/or control) a remote auxiliary component, such as a camera or flash device. For example, a remote camera may detect the flash from the mobile device and proceed to take a picture, trigger its own flash, and/or activate some other feature. In some embodiments, a mobile device, a remote camera, a remote flash device, or some other device may detect a user signal (e.g., a specific movement, noise, and/or gesture, etc.) by a person and trigger a responsive function, such as the capture of an image, the activation of a flash, or the activation of some other feature. In some embodiments, one mobile device may remotely activate features of one or more other mobile devices using the same or similar techniques.
Some aspects of the present disclosure relate to lighting or flash systems capable of automatically controlling and adjusting various operational parameters related to generating lighting or flashes for photography or videography. In some embodiments, a lighting or flash system may include one or more gyros, accelerometers, and/or other sensors which detect the position, movement, direction, and/or orientation of the lighting or flash system. The lighting or flash system may process information from the sensors in order to adjust the light or flash that the system generates (e.g., intensity, duration, color, etc.). In some embodiments, a lighting or flash system may include one or more adjusters, such as one or more servomechanisms (“servos”) or motors that can automatically adjust the direction in which a light or flash is to be generated. For example, the lighting or flash system may process information obtained from various sensors and automatically adjust the orientation of the flash with respect to the subject and/or camera in order to achieve better illumination of a subject or to obtain some other desired effect.
In some embodiments, a method for remotely performing one or more functions of an auxiliary photographic system can be configured to be used with a mobile communication device. For example, the method can include: (a) establishing communication between the mobile communication device and a plurality of photographic accessories and between each of the plurality of photographic accessories, wherein the plurality of photographic accessories are configured to be physically separate from the mobile communication device and to be physically separate from each other; (b) receiving, at each of the plurality of photographic accessories, one or more commands from the mobile communication device and one or more signals; (c) remotely controlling one or more operational parameters of the plurality of photographic accessories at least in part based on the one or more signals; (d) remotely performing one or more functions of the plurality of photographic accessories at least in part based on the one or more commands, wherein the one or more functions of the plurality of photographic accessories are at least in part synchronized in response to the one or more commands; (e) capturing an image of a subject based on the plurality of the photographic accessories conjointly performing the one or more functions and controlling the one or more operational parameters; and/or (f) adjusting the orientation and/or position of a photographic altering component relative to the subject.
In some embodiments, the method can additionally or alternatively include (a) controlling the plurality of photographic accessories based at least in part on the one or more commands from a single mobile communication device; (b) sending a command, by the mobile communication device, over at least one of a wired or wireless electronic communication connection; (c) sending the one or more signals by an onboard component of the mobile communication device, wherein the onboard component sends one or more signals in the form of at least one of light and sound; and/or (d) sending, by at least one of the plurality of photographic accessories, one or more signals, wherein the one or more signals are a result of the at least one photographic accessory performing one or more functions.
In some embodiments, one or more of the photographic accessories can be: (a) a remote camera configured to convey photographic information to the mobile communication device; and/or (b) a photographic altering component configured to alter an image to be captured.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of various inventive features will now be described with reference to the following drawings. Certain comments and descriptions are provided in the drawings as examples, but the comments and descriptions should not be understood to limit the scope of the inventions or to provide the only possible applications, structures, or usage for the illustrated examples. Throughout the drawings, reference numbers may be re-used to indicate correspondence between referenced elements. The drawings are provided to illustrate example embodiments described herein and are not intended to limit the scope of the disclosure.
FIGS. 1A-1C illustrate an embodiment of a mobile camera system.
FIGS. 1D-1F illustrate embodiments of configurations of a mobile camera system and a mobile device.
FIGS. 1G-1H illustrate embodiments of a mobile camera system communicating with a mobile device or other components via wired communication.
FIGS. 1I-1J illustrate embodiments of operating a mobile camera system in communication with a mobile device.
FIGS. 2A-2E illustrate an embodiment of a remote trigger and/or control system for use with a mobile device.
FIGS. 3A-3D illustrate an embodiment of a remote camera system for use with a mobile device.
FIGS. 4A-4C illustrate a mobile device communicating wirelessly with one or more other mobile devices or mobile components to control and/or to coordinate various functions of the one or more mobile devices and/or components.
FIG. 5A illustrates a mobile device configured to respond to one or more user signals.
FIG. 5B is a flow diagram of an illustrative process for responding to one or more user signals.
FIGS. 6A-6O illustrate an embodiment of a lighting or flash system.
FIGS. 7A-7F illustrate example uses of a lighting or flash system.
FIGS. 8A-8B illustrate examples of adjusting an operating parameter of a lighting or flash system based on a photographic subject.
FIGS. 9A-9O illustrate another embodiment of a lighting or flash system.
DETAILED DESCRIPTION OF EMBODIMENTS
The present disclosure relates generally to auxiliary photography systems for mobile devices, such as cameras, lighting, flashes, and/or programming instructions or applications for mobile devices, etc. Mobile electronic devices are mobile devices with electronic capabilities. Mobile communication devices are mobile electronic devices that are capable of communicating remotely, either in a wired or wireless manner, with another electronic device. Many different structures, features, steps, and processes are shown and/or described in discrete embodiments for convenience, but any structure, feature, step, or process disclosed herein in one embodiment can be used separately or combined with or used instead of any other structure, feature, step, or process disclosed in any other embodiment. Also, no structure, feature, step, or process disclosed herein is essential or indispensable; any may be omitted in some embodiments.
The terms “mobile electronic devices” and “mobile devices” in this specification are used in their ordinary sense, and include mobile telephones, mobile texting devices, media players, electronic tablet devices, laptop computers, desktop computers, gaming devices, wearable electronic devices (e.g., “smart watches” or “smart eyewear”), and/or mobile electronic communication devices capable of linking electronically to another device or to a network such as the Internet, etc. Some mobile electronic devices include one or more onboard cameras that can be used for various imaging purposes, such as photography and video recording. In addition, some mobile electronic devices include one or more illumination components, such as one or more lights, and/or flashes, etc., that can be used for photography, videography, and/or other purposes (e.g., as a flash light).
The term “camera” in this specification is used in its ordinary sense, and includes cameras configured for still photography, videography, or both. The cameras described herein may include one or more different lenses, or may be used with one or more auxiliary lenses. In addition, the cameras described herein may include or be configured for use with one or more illumination sources, such as lights and/or flashes.
The terms “flash” and “flash component” in this specification are used in their ordinary sense, and generally refer to electronic flash units that include LEDs, xenon-based bulbs, or other illumination sources. Each time this specification refers to “flash,” or any related or similar term, it should also be understood to refer to and encompass, either additionally or alternatively, a light source of any type, such as a pulsating light, or a generally constant light source, or a long-duration light source.
The term “lens” in this specification is used in its ordinary sense, and includes powered lenses (e.g., lenses that focus, magnify, enlarge, or otherwise alter the direction of light passing through the lens), plano lenses (e.g., lenses that are generally planar, lenses that do not taper in thickness, and/or lenses that are not powered), simple lenses, compound lenses, generally spherical lenses, generally toroidal lenses, generally cylindrical lenses, etc. Any imaging device described or illustrated in this specification can include a retainer attached to one or more lenses or optical regions with one or more different features, including but not limited to a constant or variable magnifying lens, a wide-angle lens, a fish-eye lens, a telescopic lens, a macro lens, a constant or variable polarizing lens, an anti-reflection lens, a contrast-enhancing lens, a light-attenuating lens, a colored lens, or any combination of the foregoing, etc.
Referring to FIGS. 1A and 1H, illustrative embodiments of a removably attachable camera system 100 for a mobile device 120 are shown. In some embodiments, a mobile device 120 can control the camera system 100 wirelessly, as depicted in FIGS. 1D-1F, 1I, and 1J (or via a wired connection, as depicted in FIG. 1G), such as to capture and/or record photos, videos, sound, etc., on demand. The mobile device 120 can control the camera system 100 to capture and/or record photos on a timer or in response to some event (e.g., detection of a sound, detection of a light flash), to transfer recorded images, sound, and/or other data to the mobile device 120 or some other device, to power the camera system 100 on and off, etc.
In some embodiments, as illustrated in FIGS. 1A-1C, the camera system 100 may include a retainer portion 102 with one or more attachment regions (e.g., side walls 106 and 108, as described in more detail below) for removably or permanently attaching one or more lenses 104 to the retainer. The camera system 100 may also include a sensor 110 for capturing light (e.g., light received from or transmitted by a lens 104) and recording images. As shown, the sensor 110 may be coupled to the retainer 102, and may be optically aligned with a lens 104 that is removably or permanently attached to the retainer 102. In some embodiments, the camera system 100 may include a microphone (not shown) for recording sound in conjunction with, or independently from, recording video.
The retainer 102 or some other portion of the camera system 100 may include or contain circuitry and/or other electronics (not shown) for providing additional features, such as storage of the images captured by the sensor 110, processing of the images, transmission of the images and/or any other data to a computer, to a mobile device 120, to a memory, and/or to another camera system 100, via wired or wireless communication with the mobile device 120 and/or other devices, etc. In some embodiments, the camera system 100 may include a removable memory module (not shown), such as a flash memory module, that can be read by a mobile device 120 or other computing device, exchanged with other camera systems 100, replaced with other memory modules of the same or different storage capacity, etc. The retainer 102 or some other portion of the camera system 100 may also contain or include a battery for powering the sensor 110 and other electronics.
The retainer 102 may include first and second sidewalls 106, 108 that are sized, shaped, and/or oriented to removably attach the camera system 100 to a mobile device 120. For example, as illustrated in FIGS. 1D-1F, the first and second sidewalls 106, 108 may form a channel into which a portion (e.g., a corner portion) of a mobile device 120 may be inserted. The sidewalls 106, 108 may secure the camera system 100 to the mobile device 120 using a friction fit, by “pinching” the mobile device 120, etc. In some embodiments, the retainer 102 may extend less than the entire length of one or more edges of the mobile device 120 onto which it is installed, minimizing the amount of the mobile device 120 that is obstructed when the camera system is installed. In addition, the relatively small size of the camera system 100 in comparison with the mobile device 120 enhances portability. For example, the entire camera system 100 may be substantially smaller than a face of the mobile device 120 to which it is attached (e.g., in some embodiments covering only a corner region or only an edge region of the mobile device 120), and/or small enough to be carried in a user's pocket, on a user's key ring, etc. Some embodiments of the retainer 102 or camera system 100 may incorporate or use any of the various structures, features, and/or methods described in U.S. Pat. No. 8,279,544, titled “Selectively Attachable and Removable Lenses for Mobile Devices,” which issued on Oct. 2, 2012, the contents of which are hereby incorporated by reference in their entirety.
FIGS. 1D-1F illustrate that the camera system 100 may be used with a mobile device 120 in multiple orientations or positions. For example, the mobile device 120 can be pivoted, flipped, or rotated, and then the camera system 100 can be attached to the mobile device 120 in multiple positions. The camera system 100 may be positioned in particular orientations or positions for user comfort and ease of use. For example, the camera system 100 and mobile device 120 may be configured in such a way that a user may operate the combined configuration while the mobile device 120 is in a portrait orientation (e.g., larger vertical dimension than horizontal dimension), as shown in FIGS. 1D and 1E. Alternatively, the camera system 100 and mobile device 120 may be configured in such a way that a user may operate the combined configuration while the mobile device 120 is in a landscape orientation (e.g., larger horizontal dimension than vertical dimension), as shown in FIG. 1F.
Some mobile devices 120, as shown in FIGS. 1E and 1F, may include an onboard camera 122. The camera system 100 may be installed onto the mobile device 120 such that the onboard camera 122 is partially or completely obstructed by the camera system 100, as illustrated in FIG. 1D. In such cases, the camera system 100 may effectively temporarily replace the onboard camera 122, such as when the camera system 100 includes a larger, higher-resolution, and/or higher-quality sensor 110, and/or a lens 104 configured to provide one or more desired visual or optical effects or features (such as any optical or visual effects or features described elsewhere in this specification). In some embodiments, as shown in FIGS. 1E and 1F, the camera system 100 may also or alternatively be installed onto the mobile device 120 such that the onboard camera 122 of the mobile device 120 is not obstructed. For example, the camera system 100 may be installed onto a different side, corner, or other portion of the mobile device 120 than the onboard camera 122. In this configuration, the camera system 100 may be used in conjunction with the onboard camera 122, such as to capture and/or record images from different positions or angles with respect to a subject, or with different photographic effects or attributes (such as any optical effects or features described elsewhere in this specification). In some embodiments, the images may then be combined by the camera system 100, mobile device 120, and/or some other device for any suitable purposes, such as to form three-dimensional images.
Referring to FIG. 1G, the camera system 100 may communicate with the mobile device 120 via a wired connection, such as a data and/or power cable 140. In such cases, the camera system 100 may include a port 144, such as a mini USB port, a micro USB port, a Lightning® port, or the like. The cable 140 can be coupled to the port 144 of the camera system 100 and a port 142 of the mobile device 120 to facilitate wired electronic communication of data and/or transfer of electrical power. In some embodiments, as shown in FIG. 1H, a dock 130 may be used to charge the battery of the camera system 100, transfer data to and/or from a mobile device or other computing device, and the like. Dock 130 may include a data or power connector 136 for accepting an electrical connection with camera system 100, such that data may be transferred between dock 130 and camera system 100, or dock 130 may supply power to the battery of camera system 100. A data or power connector port 132 may be included for accepting a cable or other connection to an external source (e.g., a personal computer, laptop, or the like for exchanging data, or an external power source, such as a wall outlet or the like, for supplying power to the battery of camera system 100). In some embodiments, dock 130 may include an indicator 134, such as a light (e.g., an LED), a speaker, and/or a screen, for indicating statuses and/or other information. The indicator 134 may be indicative of an electrical connection between the camera system 100 and dock 130, a recharge-complete status of the battery of the camera system 100, a data transfer status, or any status of electrical communication between the camera system 100 and dock 130 and/or an external unit (not shown).
Referring to FIGS. 1I and 1J, the camera system 100 may be controlled by or otherwise communicate with the mobile device 120. The camera system 100 may include a wireless communication module (not shown) to facilitate wireless communication with a mobile device 120 (e.g., via Bluetooth® or WiFi), and a user may use the mobile device 120 to activate or program the camera system 100 to perform functions of the camera system 100. For example, a user may set up the camera system 100 at a particular location from which photographs are desired, and that location may be physically separate or remote from the user's mobile device 120, as illustrated in FIG. 1J. As illustrated in FIG. 1I, the user may also attach the camera system 100, as described in reference to FIGS. 1D-1F, and maintain communication with the mobile device 120 without the need of a wired connection.
FIGS. 2A-2E show embodiments of a remote base, such as a remote trigger system 200, that may be controlled by or otherwise communicate with a mobile device 120. As shown in FIGS. 2A and 2B, various modular devices, for example a modular device 210 having a camera component 212 and a flash component 214, may be attached to the remote trigger system 200, and the remote trigger system 200 can facilitate remote activation or control of the modular devices by a mobile device 120. In some embodiments, as illustrated in FIG. 2E and as explained in more detail below, the modular devices may include two or more separable modular components, such as a camera device, a lighting device, a flash device, and/or a microphone, and/or some combination thereof, etc. The remote trigger system 200 may include a wireless communication module 205 to facilitate wireless communication with a mobile device 120 (e.g., via Bluetooth® or WiFi), and a user may use the mobile device 120 to activate or program the modular device(s) attached to the remote trigger system 200, as illustrated in FIG. 2D. For example, a user may set up the remote trigger system 200 at a particular location from which photographs are desired, and that location may be physically separate or remote from the user's mobile device 120. The user may attach a modular camera device 210 to the remote trigger system 200. The remote trigger system 200 can then activate and use features of the modular camera device 210 (or another modular device disclosed and/or illustrated in this specification, or any other features) according to the commands received from the mobile device 120 (e.g., take pictures on demand, according to a pre-set timer, in response to an input or other event, etc.).
A software application may be installed on or provided with a mobile device 120 for controlling the remote base or remote trigger system 200. The application may allow users to control one or more remote trigger systems 200, access individual features and settings of the modular devices attached to the remote trigger systems 200, receive data (e.g., images, sound) from the remote trigger systems 200, etc.
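As a non-limiting illustration of the command-and-control exchange described above, the following Python sketch models a mobile application sending commands to a remote trigger system. The command names, message fields, and JSON encoding are assumptions chosen for illustration; the disclosure does not specify a particular wire format or transport.

```python
import json
import time

# Hypothetical command names and payload fields; the disclosure does not
# specify a wire format, so JSON over an arbitrary transport is assumed.
def make_command(command, module="camera", **params):
    """Build a command message addressed to a remote trigger system."""
    return json.dumps({
        "command": command,      # e.g., "capture", "flash", "power_off"
        "module": module,        # which attached modular device to address
        "params": params,        # command-specific settings
        "sent_at": time.time(),  # lets the trigger system order commands
    })

class RemoteTriggerSystem:
    """Minimal stand-in for the trigger-system control loop."""

    def handle(self, raw_message):
        msg = json.loads(raw_message)
        if msg["command"] == "capture":
            self.capture(msg["params"])
        elif msg["command"] == "capture_after":
            time.sleep(msg["params"]["delay_s"])  # pre-set timer behavior
            self.capture(msg["params"])

    def capture(self, params):
        # Real firmware would drive the attached camera module here.
        print("capturing image with settings:", params)

# Example: the mobile application asks for a photo after a one-second delay.
trigger = RemoteTriggerSystem()
trigger.handle(make_command("capture_after", delay_s=1, resolution="high"))
```

In practice, the same message structure could travel over Bluetooth®, WiFi, or a wired connection, consistent with the communication options described above.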
In some embodiments, as shown in FIGS. 2A and 2B, the remote base or the remote trigger system 200 may include one or more wired electronic contacts 202 for communicating with modular devices when the devices are attached to the remote trigger system 200. The remote trigger system 200 may also include one or more indicator components 204, such as a light (e.g., an LED), a speaker, and/or a screen, for indicating statuses and/or other information. The remote trigger system 200 may include a wireless communication module to facilitate wireless communication with the mobile device 120 and/or other remote trigger systems. The remote trigger system 200 may further include an internal battery 208 for powering the components described above, the corresponding internal circuitry, and the modular devices attached to the remote trigger system 200. The remote trigger system 200 may also include, as shown in FIG. 2C, one or more data ports 206, such as AC power ports, DC power ports, network ports, mini USB ports, micro USB ports, Lightning® ports, headphone jacks, and the like to recharge the internal battery 208 and/or to facilitate wired communication using a cable 250.
In some embodiments, the remote trigger system 200 may include a trigger input 207, such as a button or touch-sensitive surface, to enable a user to activate the features of the remote trigger system 200 independently of the mobile device 120. In some embodiments, the application provided with the mobile device 120 may be configured to allow users to control one or more remote trigger systems 200 via a single remote trigger system 200. For example, the application of the mobile device 120 may allow users to synchronize multiple remote trigger systems to a trigger input 207 of one remote trigger system 200. In this way, a user may be able to operate one or more remote trigger systems while physically separated from the mobile device 120.
The remote trigger system 200 may be shaped and/or sized to provide a highly portable base and/or remote trigger system for use with mobile devices 120. For example, the remote trigger system 200 may be substantially narrower, shorter, and/or thinner than the typical mobile device with which it is used (e.g., less than or equal to about half as wide and/or less than or equal to about half as tall as a mobile phone). This size can allow one or more remote trigger systems 200 to be carried more easily in a user's hand or pocket, mounted in a wide range of locations, etc. In addition or alternatively, the remote trigger system 200 may include components 209 that facilitate mounting on traditional tripods and other photographic mounts. Modular devices may be configured to work with the remote trigger system 200 by removably attaching to the remote trigger system 200 using a friction fit, a snap-on attachment, or any other attachment mechanism. The modular devices may electronically communicate with the remote trigger system 200, such as via electronic contacts on the modular device and corresponding electrical contacts 202 on the remote trigger system 200, via a cable coupled to a port 206 of the remote trigger system 200, or wirelessly, depending upon the configuration and capabilities of the remote trigger system 200, the specific modular devices, and/or the wishes of a user. An indicator component 204 can provide information and feedback to a user regarding the state of the remote trigger system, the state of the attached modular devices, the operation of the modular devices, the state of wired or wireless connectivity between the remote trigger system, modular device, and/or mobile device, etc. For example, a modular device may be activated by a mobile device 120 to perform a particular function (e.g., capturing a photograph), and the indicator component 204 can flash or change color to indicate success or failure.
Different modular devices may provide one or more of a variety of different features, including photography, lighting, and/or sound capture, and the like. In some embodiments, as shown in FIGS. 2A and 2B, a modular device 210 may include a camera component 212 and a flash component 214. The camera component 212 may include an electronic sensor (not shown) for capturing images, and a lens for transmitting light to the sensor and optionally modifying the light. In some embodiments, the camera component 212 may be configured to receive various removably attachable lenses, such as lenses configured to magnify, darken, filter, or otherwise alter light based on the wishes of the user. The individual removably attachable lenses may be coupled to the camera module 210 or remote trigger system 200 as described above or according to any attachment mechanism known to those of skill in the art. The flash component 214 may also or alternatively be configured to receive various removably attachable flash elements, such as flash elements capable of emitting light with different colors, intensities, and the like.
In some embodiments, as shown in FIG. 2E, a single mobile device 120 may control, either directly or indirectly, multiple (e.g., two or more) separate remote base or remote trigger systems. For example, as shown, a user may use a mobile device 120 to control multiple remote base or remote trigger systems 200A, 200B, 200C, and 200D to which various combinations of modular devices have been coupled, including but not limited to one or more: lighting modules, camera modules, flash modules, microphone modules, and/or static lighting modules, etc. In some embodiments, a single remote trigger system 200 may be controlled by multiple separate mobile devices 120. For example, a single remote trigger system 200 to which a camera module has been coupled may be controlled by multiple mobile devices, in some cases generally simultaneously (e.g., one user may use a mobile device 120 to instruct the remote trigger system 200 to record video of a subject, and a second user may use a second mobile device 120 to instruct the same remote trigger system 200 to capture a still image of the subject at the same time). In some embodiments, remote base or remote trigger systems 200 may communicate with each other to exchange data, share connections, and/or synchronize operation, etc.
In some embodiments, a plurality of remote base modules or remote trigger systems 200 in communication with a mobile device 120 can each be attached electronically and/or physically (either unitarily or removably) to one or more information-capturing devices and/or one or more visual effect devices (e.g., one or more: cameras, microphones, lighting, and/or flash devices), or a mobile device 120 can be in direct electrical communication (wired or wireless) with one or more information-capturing and/or visual effect devices, in such a way that generally simultaneous information feeds (e.g., one or more different video, photo, and/or sound feeds) can be provided at about the same time to the mobile device 120 from the same scene and/or the same subject, as illustrated, to accomplish real-time or near-real-time multiplexing from different data sources. In some embodiments, the screen of the mobile device 120 can be configured, such as by an application, to display multiple, generally simultaneous images (e.g., photo or video) from different viewpoints and/or angles at about the same time. In some embodiments, the mobile device 120 can be configured to continuously choose from among a plurality of different photographic (e.g., photo or video) feeds to record and store a real-time or near-real-time collection of images.
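As a non-limiting sketch of the feed-selection behavior just described, the following Python fragment models choosing one frame per time slice from several generally simultaneous feeds. The Frame fields and the sharpness-based selection rule are illustrative assumptions; any quality metric, or an explicit user selection, could substitute.

```python
from dataclasses import dataclass

# Feed contents are simulated; an actual implementation would pull frames
# from each remote camera over its wired or wireless connection.
@dataclass
class Frame:
    source: str       # which remote camera produced the frame
    timestamp: float  # capture time, for aligning the simultaneous feeds
    sharpness: float  # hypothetical quality score used to pick a feed

def choose_frame(frames):
    """Pick one frame per time slice from among the simultaneous feeds."""
    return max(frames, key=lambda frame: frame.sharpness)

# Three generally simultaneous feeds of the same scene from different angles.
time_slice = [
    Frame("camera_A", 10.00, 0.62),
    Frame("camera_B", 10.01, 0.88),
    Frame("camera_C", 10.01, 0.75),
]
recorded = choose_frame(time_slice)
print(f"recording frame from {recorded.source}")  # camera_B
```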
FIGS. 3A-3D illustrate examples of a photographic accessory in the form of a mobile camera device 300 that may be used with a mobile device 120. In some embodiments, the mobile camera device 300 may include an onboard camera or onboard camera lens 302, an onboard or internal processor, a memory, a wireless communication module, a power supply such as a rechargeable battery, and/or one or more photographic altering or enhancing devices or components, such as a flash element 304 or a lighting element (such as a photographic soft-glow lamp, a photographic reflector, a photographic diffuser, and/or a photographic filter, etc.), and/or various other components. A mobile device 120 may communicate with one or more mobile camera devices 300 in a manner that is the same as or similar in any respect to the communication with the remote trigger systems 200 described in this specification. A mobile device 120 may communicate with and control the camera devices 300 using wireless communication technology, such as Bluetooth®. For example, as shown in FIG. 3C, a single mobile device 120 may be configured by an application 190 running on the mobile device 120 to establish a wireless connection with various mobile camera devices 300, to exchange wireless communications with the mobile camera devices 300, to modify photographic and other operational settings, to activate the mobile camera devices 300, to capture photos, video, and/or sound, and/or to generate lighting and/or flashes, etc.
In some embodiments, the mobile camera devices 300 may be shaped and/or sized to enhance or maximize portability. For example, a mobile camera device 300 may be smaller than the mobile device with which it is used (e.g., less than or equal to about half as wide and/or half as long). The portability of the mobile camera devices 300 can allow a single user to carry a plurality of mobile camera devices 300 in a pocket, bag, or case to a desired location.
In some embodiments, as illustrated in FIG. 3C, a user may mount multiple (e.g., two or more) mobile camera devices 300A, 300B, 300C on tripods 330a-d or place the mobile camera devices on various surfaces to create a mobile studio capable of recording images, video, and/or sound of a subject from various angles. An application 190 may be installed on or provided with a mobile device 120 for controlling and monitoring the various mobile camera devices 300 as described above. The application may provide real-time or recorded displays of the field of view of each mobile camera device 300, allow the user to mix and edit images, video, and/or sound from each of the mobile camera devices 300 into one or more presentations, etc.
In some embodiments, as shown in FIG. 3D, an electrical source and/or connection, such as a dock 320, may be provided to charge an internal battery of the mobile camera device 300, transfer data to and/or from the mobile camera device 300 or other computing device, and the like. The dock 320 may include one or more wired electronic contacts (not shown) for communicating with one or more mobile camera devices 300 when the devices are attached to the dock 320. The dock 320 may include multiple ports or contacts for accepting multiple mobile camera devices 300, as shown in FIG. 3D. A data or power connector port for accepting a cable, such as a mini USB port, a micro USB port, a Lightning® port, or the like, may be provided on the dock 320, and thus the mobile camera device 300, to facilitate wired electronic communication of data and/or transfer of electrical power. In some embodiments, the dock 320 may include an indicator, such as a light (e.g., an LED), a speaker, and/or a screen, for indicating statuses and/or other information. The indicator may be indicative of an electrical connection between the mobile camera device 300 and dock 320, a recharge-complete status of the battery of the mobile camera device 300, a data transfer status, or any status of electrical communication between the mobile camera device 300 and dock 320 and/or an external unit (not shown).
FIGS. 4A-4C illustrate mobile devices communicating wirelessly to exchange information, synchronize operation, remotely activate features, and the like. For example, one mobile device 400A (the “master” mobile device) may emit wireless signals that are received and processed by one or more additional mobile devices 400B-400D (the “slave” mobile devices). In response, the slave mobile devices 400B-400D may perform some function, such as taking a photograph, emitting a flash, launching an application, or some other function. The particular form of “wireless” communication between the mobile devices is not limited to traditional wireless networking (e.g., Bluetooth®, WiFi). Rather, the mobile devices may also or alternatively communicate sonically, luminously, via motion detection, or via other wireless means.
In some embodiments, as shown, mobile devices may include various input and/or output components, such as a speaker 410, an onboard camera 430, a flash 440, a microphone (not shown), some combination thereof, etc. A software application 420 may be installed on or provided with one or more mobile devices 400A, 400B. The application 420 may allow a mobile device, such as a master mobile device 400A, to communicate with another mobile device, such as a slave mobile device 400B, to operate the camera 430 and/or flash 440 of the slave mobile device 400B, or to cause the slave mobile device 400B to perform some other function. For example, when communicating using a wireless networking protocol such as Bluetooth®, each mobile device may include a Bluetooth® transceiver that performs the functions of both a transmitter and a receiver. In some embodiments, when communicating sonically, a mobile device may use a speaker 410 to perform the functions of a wireless transmitter, and another mobile device may use a microphone to perform the functions of a wireless receiver. In some embodiments, when communicating luminously, a mobile device may use a flash 440 or display screen to perform the functions of a wireless transmitter, and another mobile device may use a camera 430 to perform the functions of a wireless receiver.
The application 420 may cause the master mobile device 400A to emit a single wireless signal or a sequential pattern of wireless signals. A corresponding application 420 may be installed on the slave mobile device 400B, and may configure the second mobile device 400B to recognize the wireless signal. As described above, the specific wireless signal may be a traditional wireless networking signal, such as a Bluetooth® or WiFi signal. In some embodiments, the application 420 of the master mobile device 400A may cause the speaker 410 to emit a sound or sequence of sounds, and the corresponding application 420 of the slave mobile device 400B may receive the sound or sequence of sounds (or data derived therefrom) from the microphone of the mobile device 400B. The application 420 may process the sounds or sound data and determine that they relate to a command to activate a particular feature, such as a flash 440. In response, the application 420 may cause the slave mobile device 400B to activate the flash 440. In some embodiments, the application 420 of the master mobile device 400A can additionally or alternatively cause a flash component (not shown on the master mobile device 400A) to emit a single flash or a sequence of flashes 450A. The corresponding application 420 of the slave mobile device 400B may receive the flash or sequence of flashes 450A (or data derived therefrom) along line of sight 455B from the camera 430 of the mobile device 400B, and process the flashes or flash data 450A similarly to the sound processing described above.
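The following Python sketch illustrates, under stated assumptions, how a luminous signal could carry a command: the master encodes a command as a short on/off flash pattern, and the slave decodes per-interval brightness readings from its camera back into a command. The specific patterns, threshold, and sampling scheme are assumptions for illustration only; the same encode/decode structure would apply to a sound-based signal.

```python
# Flash patterns (1 = emit a flash, 0 = pause) mapped to commands; both the
# patterns and the brightness threshold are illustrative assumptions.
COMMANDS = {
    (1, 0, 1): "activate_flash",
    (1, 1, 0): "capture_photo",
}

def encode(command):
    """Master side: return the flash pattern to emit for a command."""
    for pattern, name in COMMANDS.items():
        if name == command:
            return pattern
    raise ValueError(f"unknown command: {command}")

def decode(brightness_samples, threshold=0.5):
    """Slave side: turn per-interval camera brightness readings into a command."""
    pattern = tuple(1 if level > threshold else 0 for level in brightness_samples)
    return COMMANDS.get(pattern)

# The master emits bright-dark-bright; the slave's camera samples each interval.
print(encode("activate_flash"))  # (1, 0, 1)
print(decode([0.9, 0.1, 0.8]))   # activate_flash
```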
Different wireless signals can provide different benefits. For example, wireless networking signals, such as signals transmitted according to the Bluetooth® standard, may be transmitted by a master mobile device 400A to a slave mobile device 400B that is not within the line of sight 455B of the master mobile device 400A (e.g., to a slave mobile device 400B that is in another room, or that is obscured within a container such as a hand bag, etc.). The use of sound signals can provide similar benefits. Light signals (e.g., flashes, infrared light), on the other hand, are typically only received by a slave mobile device 400B that is within the line of sight 455B of a master mobile device 400A transmitting the light-based signals. However, light-based signals may be used in noisy environments, and in situations when wireless networking via Bluetooth® or some other wireless networking standard is not possible or practical (e.g., when one or both of the mobile devices 400A, 400B are not configured for or otherwise capable of such standardized wireless networking).
Selection of a master device, or identification of a device as a master or a slave, may be performed explicitly, such as when a user specifies a particular device as the master and other devices as slaves. The master/slave determination may also or alternatively be implicit, such as when a user uses a particular device to establish a communication session. In this example, other devices that join the communication session or subsequently communicate with the master may automatically become slaves. In some embodiments, the master/slave distinction may be transitory or may be made on an ad hoc basis. For example, a user may use a particular device to transmit commands or messages to some other device. In this example, the sending device may be the master and the target of the message or command may be a slave. However, the slave device may subsequently send a message or command to the master device, and the devices may effectively swap roles.
In some embodiments, as shown in FIG. 4B, more than two mobile devices may be synchronized or otherwise configured to communicate using the various wireless methods described above. A user may arbitrarily choose any device to operate as the master, and the remaining devices may automatically become slaves. The choice of master and slave devices may be permanent, may be maintained until changed by a user, or may be temporary (e.g., only maintained for one particular communication session). The user may use an application 420 on one of the mobile devices to select or set the master device. A wireless signal may be transmitted to the other devices to identify the master device or to instruct the other devices that they are slaves. In some embodiments, a user may use the application 420 on each mobile device to identify the device as either a master or slave, or to set which device is to be the master of a current device.
The example interactions described above are illustrative only, and are not intended to be limiting. In some embodiments, mobile devices may use the application 420 and the various wireless transmitters and receivers described above to exchange information and send commands regarding any function that can be performed by the mobile device, including but not limited to taking a photograph, emitting a flash, recording video, recording sound, playing music, launching or activating another application, presenting some user-perceived output, etc. In some embodiments, a mobile device may be configured to recognize multiple (e.g., two or more) different sequences of wireless input and perform different functions responsive to the particular input received. For example, the mobile device can determine a particular message or command that corresponds to some specific input, and perform a function based on the determined message or command.
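As a non-limiting sketch of this input-to-function mapping, the Python fragment below dispatches recognized input sequences to device functions. The sequence names and the actions are hypothetical placeholders for whatever inputs and functions a given device supports.

```python
# Hypothetical recognized inputs ("flash", "tone_high", "tone_low") mapped to
# device functions; the sequences and actions are placeholders.
def take_photo():
    print("photo captured")

def emit_flash():
    print("flash emitted")

def start_video():
    print("video recording started")

DISPATCH = {
    ("flash", "flash"): take_photo,         # two flashes -> capture an image
    ("tone_high", "tone_low"): emit_flash,  # descending tones -> fire the flash
    ("flash", "tone_high"): start_video,    # mixed input types can also map
}

def on_input_sequence(sequence):
    """Determine the command for a recognized input sequence and perform it."""
    action = DISPATCH.get(tuple(sequence))
    if action is not None:
        action()

on_input_sequence(["flash", "flash"])  # photo captured
```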
In some embodiments, as illustrated in FIG. 4C, a mobile device or group of mobile devices may be configured (e.g., by applications 420 executing thereon) to recognize commands and trigger functions based on input received from sources other than mobile devices. For example, a mobile device 400B executing an application 420 may recognize a single flash or a particular sequence of flashes 475 from element 470, which may represent one or more flashlights, stand-alone camera flashes, vehicle headlights, light bulbs, strobe lights, lightning, combustion (e.g., fires), flares, fireworks, explosions, etc. For example, the mobile device 400B may detect a flash or sequence of flashes 475, or a flash from the master mobile device 400A, and emit a relay flash 450B from the flash 440 of mobile device 400B. The relay flash 450B may indicate that mobile device 400B has received flash 475 or 450A and activated one or more functions of the application included in mobile device 400B. Such functions may include automatically taking a photographic image, recording a video, turning on/off the mobile device 400B, or any other function programmed into mobile device 400B based on the application included in mobile device 400B. As another example, a mobile device 400B executing an application may recognize a single sound or a particular sequence of sounds from a speaker, user (e.g., voice), musical instrument, tone generator (e.g., a tuning fork), environmental noise, thunder, explosions, etc. In certain embodiments, a mobile device may be configured to recognize and respond to input from only specific light sources based on characteristics of the light (e.g., color, hue, intensity, etc.), or from only specific sound sources based on characteristics of the sound (e.g., tone, pitch, volume, etc.).
FIG. 5A shows an example of a mobile device 500 configured to perform one or more functions in response to a user signal, such as a user action or a subject action (e.g., one or more gestures or other movements and/or sounds). In some embodiments, as shown, the mobile device 500 may be identical or similar in any respect to other mobile devices described herein. For example, the mobile device 500 may include a flash component and a camera (not shown). An application 520 may be installed on or provided with the mobile device 500. The application 520 can configure the mobile device 500 to recognize one or more specific gestures 515 (e.g., arm movements, hand movements, etc.), one or more sounds, or other actions of a subject or a user 510. For example, the camera of the mobile device 500 may record a stream of video or record images according to some predetermined or dynamically determined interval. The application 520 or some other module or component of the mobile device (e.g., a “listener” service operating in the background that detects an event in data and triggers another application or service in response to detection of the event) can process the camera input and determine whether it contains evidence or an indication of a subject or a user 510 providing some predetermined signal. Upon recognition of the signal, the application 520 can cause the mobile device 500 to perform some action, such as activating a photo timer; capturing an image; emitting a flash; activating, deactivating, and/or changing the hue and/or intensity of a photographic light, filter, and/or reflector; launching an application; and/or performing any other function that the mobile device 500 is capable of performing. In some embodiments, the mobile device 500 may provide feedback to the user 510 indicating that the subject or user signal has been recognized and/or that a particular function has been performed. For example, the mobile device 500 may emit a flash 554 or a sound, and/or display an image on a screen.
FIG. 5B is a flow diagram of an illustrative process for implementing a signal-recognition (e.g., from a user or a subject) feature on a mobile device 500. At block 550, a user or a subject may initiate the application 520 (or some portion thereof) on the mobile device 500. In some embodiments, the operation may be initiated by a user or other mobile device 500 performing or providing one or more signals that the mobile device 500 is preprogrammed to recognize. For example, the mobile device 500 may monitor a scene including a subject or user 510, and the subject or user may perform a specific gesture 515 that the application 520 and/or mobile device 500 recognizes as a signal to initiate one or more functions programmed into the application 520. At block 552, the mobile device can detect the signal, such as the specific gesture 515. The application 520 can optionally provide a first feedback, such as a flashing light, to indicate that the device has detected and recognized the signal. The light may flash according to some specific pattern or sequence to convey recognition of the signal to the user. At block 554, the mobile device 500 can perform the action that corresponds to the signal. For example, after the application 520 and/or mobile device 500 detects the signal, the application 520 may cause the mobile device 500 to perform the one or more functions corresponding to the signal, such as but not limited to taking a picture, turning on the flash, turning on the mobile device 500, or operating any application or function associated with the specific signal. At block 556, the mobile device can optionally provide a second feedback to the user, such as a flash or sequence of flashes indicating that the function has been performed. The second feedback may be the same as the first feedback indicative of detecting the signal. The second feedback may also or alternatively be different from the first feedback, such that a user may be able to distinguish between the separate feedbacks.
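A minimal Python sketch of the FIG. 5B process is shown below, assuming a separate gesture recognizer that yields named gestures from the camera stream. The gesture names, actions, and feedback patterns are illustrative assumptions, not part of the disclosure.

```python
# Gesture names, actions, and feedback patterns are illustrative placeholders;
# a real device would run a recognizer over the camera stream.
ACTIONS = {"raise_hand": "capture_image", "wave": "start_timer"}

def flash_feedback(pattern):
    print("feedback flash:", pattern)  # stand-in for driving the flash element

def perform(action):
    print("performing:", action)       # stand-in for the camera/timer function

def on_recognized_gesture(gesture):
    action = ACTIONS.get(gesture)
    if action is None:
        return                         # not a predetermined signal; keep monitoring
    flash_feedback("short-short")      # block 552: first feedback, signal detected
    perform(action)                    # block 554: run the corresponding function
    flash_feedback("long")             # block 556: second, distinguishable feedback

on_recognized_gesture("raise_hand")
```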
FIGS. 6A-6O and 7A-7F illustrate a smart lighting and/or smart flash system 600 that can automatically adjust one or more operational parameters (e.g., the physical position and/or orientation of the lighting or flash system, characteristics of the light to be emitted, etc.) based on information obtained from sensors in the lighting or flash system 600, from a mobile device 610, and/or from other data sources. In some embodiments, as shown in FIGS. 6B and 6C, the lighting or flash system 600 may be a stand-alone device that is configured to provide illumination for use in photography, videography, and other situations. A separate mobile device 610, such as a mobile device with an onboard camera lens or a mobile device configured to use the camera system described above with respect to FIG. 1, may communicate with the lighting or flash system 600 in order to obtain the desired lighting or flash for the current situation. For example, a mobile device 610 may determine the distance between the mobile device 610 and the subject to be photographed. The mobile device 610 may also determine the distance between the lighting or flash system 600 and the subject. The mobile device 610 can calculate desired operational parameters for the lighting or flash system 600, such as intensity, duration, hue, and/or direction, etc., and transmit information about the desired lighting or flash parameters to the lighting or flash system 600. The lighting or flash system 600 can then implement the desired operational parameters and emit an optimal or desired flash.
Referring to FIG. 6A, the lighting or flash system 600 may include a head portion 602 that is movable with respect to a base portion 606. The head portion 602 can house a flash element 604, such as an LED, a xenon-based bulb, or some other flash element known to those of skill in the art. The lighting or flash system 600 may include various sensors or other components for determining orientation, position, and/or other information, etc. For example, the lighting or flash system 600 may include a gyroscope, an accelerometer, a local positioning system module, a global positioning system (“GPS”) module, and/or a compass. Information about the lighting or flash system 600 can be obtained via the sensors and provided to a computer, such as a mobile device 610 or an internal or onboard processor. The lighting or flash system 600 may also include one or more adjusters, such as one or more servomechanisms (“servos”) or motors for implementing physical adjustments (e.g., adjustments to the position and/or orientation with respect to the subject or scene, and/or other characteristics of the lighting or flash system 600), as shown in FIG. 6O. The lighting or flash system 600 may include a battery to power the sensors, adjusters, flash element, lighting element, and/or other electrical components. In some embodiments, as shown in FIG. 6M, the lighting or flash system 600 may include a power cable 608 to draw electrical power from a mobile device 610 or from a standard power source (e.g., a wall outlet). Cable 608 may also facilitate data connectivity and control of the lighting or flash system 600 by the mobile device 610. As shown in FIG. 6M, the lighting or flash system 600 may utilize cable 608 to power the mobile device 610. In some embodiments, the lighting or flash system 600 may include an activation input 605, such as a button or other touch-sensitive surface, for activating the lighting or flash system 600 and the electronics contained within the lighting or flash system 600, as will be described in more detail with reference to FIG. 7A.
As shown in FIGS. 6D-6K, the lighting or flash system 600 may be held by a user during operation, placed on a surface (e.g., a table or floor), or mounted in a temporary or permanent location. For example, the lighting or flash system 600 may be mounted to a tripod, headwear that may be worn by a user (e.g., a hat or helmet), other wearable mounts (e.g., a wrist mount, hand mount, necklace), a wall or ceiling, etc.
The lighting or flash system 600 may communicate with the mobile device 610 via wireless means, as shown in FIGS. 6B, 6C, and 6L, such as wireless signals transmitted in accordance with the Bluetooth® standard or using other wireless techniques described herein or known to those of skill in the art. In some embodiments, the lighting or flash system 600 may communicate with the mobile device 610 via a wired connection, such as a cable 608 that is coupled to a port of the flash system 600 and a corresponding port of the mobile device 610, as shown in FIG. 6M. In some embodiments, as illustrated in FIG. 6C, multiple lighting or flash systems 600 may communicate with a single mobile device, multiple mobile devices may control a single lighting or flash system, multiple lighting or flash systems may be used with multiple mobile devices, etc. For example, a user may use a mobile device to control multiple lighting or flash systems 600, having each of the lighting or flash systems 600 emit a flash in a desired sequence or simultaneously, depending upon the needs of the user.
FIGS. 7A and 7B illustrate an example of a lighting or flash system 600. The lighting or flash system 600 and mobile device 610 may execute an initial startup or synchronization procedure whereby an application on the mobile device 610 determines a starting position and orientation for the mobile device 610 and the lighting or flash system 600. For example, the lighting or flash system 600 may be placed near or touched to the mobile device 610 so that the two devices occupy substantially the same space. In some embodiments, the lighting or flash system 600 includes an activation input 605, such that a user may operate the activation input 605 to power on the lighting or flash system 600 and, generally simultaneously, before, or afterward, operate the application on the mobile device 610 to locate, communicate with, and synchronize the lighting or flash system 600 with the mobile device 610. Information from the sensors of the flash component 600 may be provided to the mobile device 610 so that the sensors on the two devices can be synchronized and a starting location for each can be determined. For example, as illustrated in FIGS. 7C-7E, the flash component 600 may be positioned in multiple locations relative to the mobile device 610. The flash component 600 may send sensor readings or calibration signals, such as a signal or sequence of flashes of light, to the mobile device 610 at each of the multiple locations. The mobile device 610, or an application therein, may receive the sensor readings or calibration signals to further synchronize and calibrate control and operation of the flash component 600. Subsequent sensor readings can be compared to initial measurements in order to determine how the position or orientation of the lighting or flash system 600 has changed. A user may activate some input control (e.g., a button displayed by an application) to begin use of the lighting or flash system 600, such as beginning a lighting or flash “session.”
The user may then begin using the mobile device 610 and lighting or flash system 600 to take photographs. The lighting or flash system 600 may detect its current position and orientation using one or more internal sensors and/or data obtained from a mobile device 610. For example, the lighting or flash system 600 can use a gyroscope, accelerometer, compass, and/or other sensors to determine a vertical position (e.g., height) or a change in vertical position, a horizontal position or a change in horizontal position, a direction (e.g., north/south/east/west), and/or an orientation (e.g., tilt). The lighting or flash system 600 can transmit data regarding the current position and orientation to the mobile device 610 at the request of the mobile device 610 (e.g., in response to a command initiated by an application executing on the mobile device 610), according to some predetermined or dynamically determined schedule, in response to some event (e.g., in response to detecting a change in position exceeding some threshold), or the like.
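The event-driven reporting option (transmitting when a change exceeds some threshold) might be sketched as follows; the 25 cm threshold, the sample positions, and the print-based transport are all illustrative assumptions.

    import math

    REPORT_THRESHOLD_M = 0.25   # assumed: report after moving 25 cm

    last_reported = (0.0, 0.0, 0.0)

    def maybe_report(current_pos):
        """Send a position report only when movement exceeds the threshold."""
        global last_reported
        # math.dist computes Euclidean distance (Python 3.8+).
        if math.dist(current_pos, last_reported) > REPORT_THRESHOLD_M:
            print(f"report -> mobile device 610: position now {current_pos}")
            last_reported = current_pos

    for pos in [(0.10, 0.0, 0.0), (0.30, 0.0, 0.0), (0.35, 0.05, 0.0)]:
        maybe_report(pos)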
Referring to FIG. 7F, the mobile device 610 may include an application that can calculate and transmit information regarding the optimum or desired position and orientation of the lighting or flash system 600 with respect to a photographic subject 700. In some embodiments, the mobile device 610 can determine a distance between the lighting or flash system 600 and the subject 700 to be photographed using triangulation. The mobile device 610 can determine the location of the subject 700 to be photographed using information about the location and orientation of the mobile device 610 (e.g., obtained using a sensor, GPS unit, compass, etc.) and information about the distance between the mobile device 610 and the subject 700 to be photographed (e.g., based on information determined during auto-focus processing). The mobile device 610 can also determine the location of the lighting or flash system 600 based on the information obtained from the lighting or flash system 600, as described above with reference to FIGS. 7C-7E. Once the locations of the mobile device 610, the lighting or flash system 600, and the subject 700 to be photographed are determined, the mobile device 610 can determine optimal or desired parameters for the lighting or flash system 600 and instruct the lighting or flash system 600 accordingly.
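A worked sketch of this triangulation in two dimensions, with hypothetical coordinates: the mobile device 610 knows its own position, its bearing toward the subject (e.g., from its compass), and its range to the subject (e.g., from auto-focus processing), and it knows the flash system's position from the reported sensor data, so the flash-to-subject distance follows directly.

    import math

    phone = (0.0, 0.0)
    bearing_deg = 0.0        # phone facing "north" toward the subject
    subject_range_m = 3.0    # from auto-focus processing (assumed)

    # Subject position from the phone's bearing and range.
    subject = (phone[0] + subject_range_m * math.sin(math.radians(bearing_deg)),
               phone[1] + subject_range_m * math.cos(math.radians(bearing_deg)))

    flash = (1.5, 0.5)       # from the flash system's reported sensors

    print(f"flash-to-subject distance: {math.dist(flash, subject):.2f} m")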
The lighting or flash system 600 can activate an adjuster, such as a servo (e.g., a rotary actuator or linear actuator), to adjust the angle of the head portion 602 with respect to the base portion 606, and therefore to adjust the angle of the flash element 604 with respect to the photographic subject 700. Alternatively or in addition, the mobile device 610 can sense, calculate, solicit from the user, and/or transmit information regarding the optimum or desired lighting or flash characteristics to the lighting or flash system 600. The lighting or flash system 600 can then adjust the color, hue, intensity, duration, and other light-related or flash-related parameters. The lighting element or flash element can then be controlled and/or triggered from the mobile device 610, such as when the mobile device 610 is taking a picture.
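The head-aiming step might be sketched as follows: compute the bearing from the flash system to the subject and command a servo to that angle. The set_servo_angle callable is a hypothetical stand-in for the actuator interface, which the specification leaves unspecified.

    import math

    def aim_head(flash_pos, subject_pos, set_servo_angle):
        """Point the head portion 602 toward the subject 700."""
        dx = subject_pos[0] - flash_pos[0]
        dy = subject_pos[1] - flash_pos[1]
        angle_deg = math.degrees(math.atan2(dx, dy))  # bearing; 0 deg = +y
        set_servo_angle(angle_deg)
        return angle_deg

    aim_head((1.5, 0.5), (0.0, 3.0),
             lambda a: print(f"servo -> {a:.1f} deg"))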
As shown in FIGS. 8A and 8B, the position and orientation of the lighting or flash system 600 may change, such that the distance between the lighting or flash system 600 and a photography subject 700 changes from a first distance 710 to a second distance 712. The angle 720 formed by the lighting or flash system 600, the photography subject 700, and the mobile device 610 may also change to a second angle 722. Sensor readings or other information regarding the current position and/or orientation of the lighting or flash system 600, and/or sensor readings or other information regarding the change in distance and/or angle with respect to the subject 700, may be transmitted from the lighting or flash system 600 to the mobile device 610. The mobile device 610 can then determine various modifications to operational parameters of the lighting or flash system 600 to achieve one or more optimal or desired lighting effects, such as any of those described elsewhere herein. In some embodiments, the position of the mobile device 610 may change such that the distance between the mobile device 610 and the subject 700 is changed. The mobile device 610 can determine various modifications to the orientation or other operational parameters of the lighting or flash system 600 to achieve one or more optimal or desired lighting or flash effects, as described elsewhere in this specification. In some embodiments, a user may change the orientation of the mobile device 610 in order to photograph or record a different subject. The mobile device 610 can triangulate or otherwise determine the distance between the lighting or flash system 600 and the new subject based on information input or received by the user and/or other information as described elsewhere herein. The mobile device 610 can then determine various modifications to the orientation and/or other operational parameters of the lighting or flash system 600 in order to achieve one or more optimal or desired lighting or flash effects with respect to the new subject.
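A hypothetical sketch of recomputing the geometry of FIGS. 8A and 8B: given the three positions, the angle formed at the subject 700 between the flash system 600 and the mobile device 610 follows from the law of cosines. The coordinates below are illustrative.

    import math

    def angle_at_subject(flash, subject, phone):
        """Angle (degrees) at the subject between the flash and the phone."""
        a = math.dist(subject, flash)   # flash-to-subject (e.g., distance 710)
        b = math.dist(subject, phone)   # phone-to-subject
        c = math.dist(flash, phone)
        # Law of cosines: c^2 = a^2 + b^2 - 2ab*cos(angle)
        cos_angle = (a * a + b * b - c * c) / (2 * a * b)
        return math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))

    before = angle_at_subject((1.5, 0.5), (0.0, 3.0), (0.0, 0.0))
    after = angle_at_subject((2.5, 1.0), (0.0, 3.0), (0.0, 0.0))
    print(f"angle changed from {before:.1f} to {after:.1f} degrees")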
In some embodiments, the mobile device 610 may calculate one or more optimum or desired operating parameters for the onboard camera and/or flash based on state information associated with the various devices and/or environmental factors, etc. For example, as illustrated in FIG. 6N, an application on the mobile device 610 can determine an optimum or desired time at which to activate the lighting or flash element 604 and/or the camera of the mobile device based on an analysis of “shake” caused by a user's body (e.g., an unsteady hand). The application may use information from an internal accelerometer of the mobile device to determine or predict the shaking of the mobile device 610. Based on that determination, the application can time the camera shutter so that it takes the photo at a desired “still” moment (a period of no movement or lower-rate movement). A flash emitted by the flash system 600 can be coordinated to also fire at the proper moment. As a result, the application may delay the photo capture, such that the application may not necessarily take the photo at the instant when the user presses the shutter button, but instead at some time thereafter (e.g., a second later, or a fraction of a second later), once the application has determined that it is a preferred or optimal moment to take the picture.
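A minimal sketch of this shake-timing behavior, assuming a simple policy: after the shutter press, wait (up to a short deadline) for the accelerometer magnitude to settle near 1 g before firing the camera and the coordinated flash. The sensor read is simulated and the thresholds are assumptions, not disclosed values.

    import random, time

    STILL_THRESHOLD_G = 0.05   # deviation from 1 g treated as "still"
    MAX_DELAY_S = 1.0          # never delay the photo more than a second

    def read_accel_magnitude():
        # Stand-in for the mobile device's accelerometer; returns |a| in g.
        return 1.0 + random.uniform(-0.15, 0.15)

    def capture_when_still(fire):
        """Fire at the first still moment, or at the deadline regardless."""
        deadline = time.monotonic() + MAX_DELAY_S
        while time.monotonic() < deadline:
            if abs(read_accel_magnitude() - 1.0) < STILL_THRESHOLD_G:
                fire()          # trigger shutter and coordinated flash
                return True
            time.sleep(0.01)    # poll at roughly 100 Hz
        fire()                  # deadline reached: take the photo anyway
        return False

    capture_when_still(lambda: print("shutter + flash fired"))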
In some embodiments, the lighting or flash system 600 may process sensor information and determine appropriate adjustments to its own operational parameters, rather than receiving instructions or adjustments from a mobile device 610. The lighting or flash system 600 may comprise one or more sensors that enable the lighting or flash system 600 to “be aware” of where it is in relation to the mobile device 610 and the photographic subject 700, such as by automatically triangulating itself, and to determine a preferred or optimal timing and direction in which to actuate the lighting and/or the flash based on sensors in the lighting or flash system 600 and/or data obtained from the mobile device 610, etc.
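This self-adjusting variant might amount to a small on-board control loop, sketched below under the assumption that the flash system periodically compares its sensed bearing to the subject against the head's current angle; every name here is an illustrative stand-in for on-board firmware.

    def control_step(sensed_bearing_deg, head_angle_deg, set_servo_angle,
                     tolerance_deg=2.0):
        """Re-aim the head only when the pointing error exceeds a tolerance."""
        error = sensed_bearing_deg - head_angle_deg
        if abs(error) > tolerance_deg:
            set_servo_angle(head_angle_deg + error)   # re-aim toward subject
            return head_angle_deg + error
        return head_angle_deg

    angle = 0.0
    for bearing in [0.5, 5.0, 5.5, -10.0]:
        angle = control_step(bearing, angle,
                             lambda a: print(f"servo -> {a:.1f} deg"))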
FIGS. 9A-9O illustrate an embodiment of a lighting or flash system 900 with multiple (e.g., two or more) individual lighting or flash elements 904. The lighting or flash system 900 may be similar or identical in some or all respects to the lighting or flash system 600 described elsewhere herein. For example, the lighting or flash system 900 may include a head portion 902 and a base portion 906. The base portion 906 may include a mating body 907, as illustrated in FIG. 9B, designed for interchangeability of head portions 902, such that a user may change between different types of head portions 902 based on the desired use. The mating body 907 may be an interlocking mount that permits a sliding motion of the head portion 902 along the mating body 907; once in proper alignment, the head portion 902 may lock or be securely held in place relative to the base portion 906 by the mating body 907. The various lighting or flash elements 904 may be positioned on the head portion 902 such that individual lighting or flash elements 904 or groups of lighting or flash elements 904 may be selected to actuate or fire based on the direction in which emission of a light or flash is desired. In some embodiments, as shown, the head portion 902 may be spherical or substantially spherical. Individual lighting or flash elements 904 may be positioned about the head portion 902 to emit light in different directions.
In use, the lighting or flash system 900 can provide (e.g., in a wired or wireless transmission) information to a mobile device 610 about the current position and/or orientation of the lighting or flash system 900, and/or any other information about the lighting or flash system 900 and/or existing lighting conditions or other conditions relating to a subject or scene to be photographed. The mobile device 610 can determine which individual lighting or flash elements 904 should be actuated in order to achieve an optimal or desired lighting or flash effect. The mobile device 610 can transmit instructions to the lighting or flash system 900, and the lighting or flash system 900 can actuate the appropriate flash element 904 or group of flash elements 904. In this way, the head portion 902 does not need to be rotated or angled with respect to a photographic subject. Instead, specific flash elements 904 can be activated on demand, nearly instantly, and much faster than if a motor or servo had to re-orient the head portion 902 with respect to the photographic subject. Thus, faster response times can be achieved, resulting in fewer lost opportunities or sub-optimal photos or videos. In some embodiments, various operational parameters of the flash elements 904 may be modified to improve lighting, such as color, intensity, and the like, similar to the modifications described elsewhere herein with respect to the lighting or flash system 600. The operational parameters of the flash elements 904 may be synchronized, or the operational parameters of individual flash elements 904 may be set independently of one another to provide additional flexibility and lighting effects.
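As a purely illustrative sketch of this element-selection logic, assume each flash element 904 on the head portion 902 has a fixed unit emission vector; the element whose vector best aligns with the direction to the subject (largest dot product) is the one fired. The element layout and identifiers below are hypothetical.

    import math

    ELEMENTS = {                 # element id -> unit emission vector (x, y)
        "north": (0.0, 1.0),
        "east":  (1.0, 0.0),
        "south": (0.0, -1.0),
        "west":  (-1.0, 0.0),
    }

    def best_element(direction_to_subject):
        """Pick the element whose emission direction best matches the target."""
        mag = math.hypot(*direction_to_subject)
        d = (direction_to_subject[0] / mag, direction_to_subject[1] / mag)
        return max(ELEMENTS,
                   key=lambda k: ELEMENTS[k][0] * d[0] + ELEMENTS[k][1] * d[1])

    print(best_element((0.9, 0.3)))   # -> "east"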
Although this invention has been disclosed in the context of certain embodiments and examples, it will be understood by those skilled in the art that the present invention extends beyond the specifically disclosed embodiments to other alternative embodiments and/or uses of the invention and obvious modifications and equivalents thereof. It is also contemplated that various combinations or subcombinations of any specific features and aspects of any embodiments may be combined with any specific features of any other embodiments, which still fall within the scope of the invention. Accordingly, it should be understood that various features and aspects of the disclosed embodiments can be combined with or substituted for one another in order to form varying modes of the disclosed invention.