CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims the priority of U.S. provisional patent application 61/600,639, filed Feb. 18, 2012, which is incorporated herein by reference.
COPYRIGHT NOTICE
Portions of the disclosure of this patent document contain material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the U.S. Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The invention relates to a method and an arrangement for displaying images, static or dynamic, underwater, and to a method and system for controlling the lighting display of same, alone or in coordination with changes in the ambient underwater lighting.
2. Background of the Invention
Underwater lighting has advanced over the course of many years, improving in safety along with the quality of visual effects. From the original incandescent white lights used solely for illumination of pools in poor lighting conditions, leaps in technology have now moved to the most modern and technically advanced water shows of today. The desire for better lighting and improved or enhanced effects has driven the market to these improvements. Static lighting in and around pools or bodies of water comes in an increasingly dazzling array of colors and profusions, with various optics and heat management devices and methods providing ever more vibrant and ambitious displays of color in water.
Examples of some accent or background lights include:

| U.S. Pat. No. 6,472,990 | Delmar |
| U.S. Pat. No. 7,357,525 | Doyle |
| U.S. Pat. No. 7,410,268 | Koren et al. |
| U.S. Pat. No. 7,699,515 | Faber |
| U.S. Pat. No. 7,740,367 | Koren |
Additionally, there exist devices that are exterior to a pool or body of water for exciting visual displays or to provide enhanced pool safety; these include:

| U.S. Pat. No. 4,196,461 | Geary |
| U.S. Pat. No. 4,887,197 | Effinger |
| U.S. Pat. No. 5,576,901 | Hanchett |
| U.S. Pat. No. 6,278,373 | Jaurigue et al. |
| U.S. Pat. No. 6,951,411 | Kumar et al. |
| U.S. Pat. No. 7,810,942 | Kunkel et al. |
| U.S. Pat. App. Pub. No. 2005/0146777 | Russell |
Similarly, several devices have been suggested that incorporate lighting, collimated or otherwise, in bodies of water; some examples include:

| U.S. Pat. No. 5,934,796 | Quereau |
| U.S. Pat. No. 6,086,215 | Giattino et al. |
| U.S. Pat. No. 7,482,764 | Morgan et al. |
| U.S. Pat. No. 7,717,582 | Longoria et al. |
| U.S. Pat. No. 7,744,237 | Potucek et al. |
| U.S. Pat. App. Pub. No. 2010/0019686 | Gutierrez, Jr. |
However, to date, no commercially available application has been made of projection and illumination of graphics underwater, in a pool or body of water, as part of a display or as an illumination schema from under the water. Thus, there exists a need for a device and associated method allowing for presentation of images and/or graphics from an underwater position to a point in an underwater environment. Such a device should meet the safety requirements for existing underwater lights, be compact, and provide pleasing images and/or graphics. Additionally, there exists a need for methods that allow for control of the graphics and ambient lighting such that they are synchronized to realize desired visual effects and adjustment of the projected images and/or graphics. The combination of graphics or imaging lighting and ambient or non-imaging lighting control and display should allow for strong contrasts in displays, providing heretofore unknown contrast and clarity in underwater projection.
SUMMARY OF THE INVENTION
The invention includes a method, a system, an apparatus, and an article of manufacture for an underwater image display system and lighting control system.
An aspect of the invention is to provide a device and associated methods that allow for presenting images/graphics in underwater environments in a controlled fashion.
Another aspect of the invention is that the projection of the applied image may use either the walls or the bottom, or both surfaces, of the body of water, water feature, pool or spa for the projection of images, the spatial size and intensity of the graphics being controlled to accommodate the projection surface and the water through which the projection is being made.
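The intensity control described above must account for absorption along the water path. As a minimal sketch only, assuming a simple Beer-Lambert attenuation model and a placeholder attenuation coefficient (the real coefficient varies with water clarity and wavelength, and the specification does not prescribe any particular model), the drive intensity could be scaled as follows:

```python
import math

def compensated_intensity(base_intensity, path_length_m,
                          attenuation_per_m=0.05, max_intensity=1.0):
    """Return the drive intensity needed so that base_intensity survives
    exponential attenuation over path_length_m metres of water, clamped
    to the projector's maximum output. All parameter names and the
    0.05/m default are illustrative assumptions."""
    required = base_intensity * math.exp(attenuation_per_m * path_length_m)
    return min(required, max_intensity)
```

A longer throw to the far wall would thus automatically receive a brighter drive level, up to the hardware limit.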
A further aspect is to accommodate underwater projection of rendered graphics using techniques of raster imaging, using image projection systems such as LCOS, LCD, DLP, hybrid and laser projection technologies, or vector graphics projection systems such as laser-based beam steering or other steering and/or modulation devices, in either monochrome or multi-color illumination.
A still further aspect is to provide for user input and communications through wired or wireless systems and to provide a power input compatible both with commercially available power supplies for bodies of water, such as 12VAC/24VAC systems, and with 120V outputs via a low-voltage conversion safe for use in and around bodies of water.
Yet a further aspect is using vector-generated graphics, where the device may produce monochrome or multicolor images, the multicolor images being generated using lasers or LEDs or other light sources of multiple colors, where the colors may be blanked using mechanical or electronic methods and may be combined using optics or made discretely available to the beam steering or modulation mechanism.
A further object is to incorporate an underwater system that uses dispersion gratings, for example but certainly not limited to reflective or transmissive gratings, to create images when coherent light is incident on the grating.
A further aspect is to incorporate an underwater system that uses a spatial light modulation system, again for example but certainly not limited to reflective or transmissive systems, that can modulate the wave front to create images and/or patterns when coherent light is incident on the spatial light modulation system.
A still further aspect is to include an enclosure that allows for the device to be mounted within the confines of existing pool wall or boat hull mounting techniques.
Yet a further aspect of the invention is employing an additional image steering device beyond the initial projected image steering device to move the steered, projected image around within the confines of the body of the water feature, pool or spa.
Another aspect of the device and method of operating underwater is a device or method capable of producing a plurality of vector graphics in a plurality of colors that work in combination with embodied light sources, for instance but certainly not limited to LEDs that are capable of producing a plurality of colors to serve as underwater ambient lighting.
A further aspect is a method allowing for control of the graphics and ambient lighting such that they are synchronized to realize desired visual effects with a combination of graphics and ambient lighting that allows for strong contrasts, for example green graphics with violet ambient light, red graphics with blue ambient light and the like.
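One simple way to realize the strong-contrast pairings named above (green with violet, red with blue, and the like) is to rotate the graphics color's hue half-way around the color wheel to obtain the ambient color. This is an illustrative sketch, not the specification's method; the function name and the RGB tuple convention are assumptions:

```python
import colorsys

def contrast_color(rgb):
    """Given a graphics color as an (r, g, b) tuple of 0-255 ints,
    return the hue-opposite color for the ambient light sources."""
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    r2, g2, b2 = colorsys.hsv_to_rgb((h + 0.5) % 1.0, s, v)
    return tuple(round(c * 255) for c in (r2, g2, b2))
```

For example, red graphics would be paired with a cyan ambient wash, and green graphics with a magenta/violet wash.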
Yet another aspect of the invention is a method and device or system for allowing for communication in a system where the image or graphic light generating devices are discrete from the ambient light generating devices where both or one of them is underwater.
Yet another aspect of the invention is a method that uses a plurality of lenses to achieve a desired beam spread and focus.
Another aspect of the invention is the use of combined lenses with a transparent cover such that the combined lenses and cover provides waterproofing and the desired optical divergence/convergence characteristics.
Yet another aspect of the invention is a method that uses the power supply line or power toggling to cycle through sequences allowing for control of both, either together or independently, the ambient lighting color and/or intensity and image or graphics projection and sequencing.
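Power-toggle sequencing of this kind is commonly realized by timing supply interruptions: a brief interruption advances to the next stored program, while a long power-off resets to the first. The sketch below is one possible realization under stated assumptions; the program names, the two-second threshold, and the class interface are all hypothetical:

```python
# Illustrative program list; real units would store their own sequences.
SEQUENCES = ["ambient_blue", "ambient_violet", "graphics_show", "combined_show"]

class ToggleSequencer:
    def __init__(self, short_off_max_s=2.0):
        self.short_off_max_s = short_off_max_s  # assumed threshold
        self.index = 0

    def power_restored(self, off_duration_s):
        """Called when power returns; off_duration_s is how long it was off."""
        if off_duration_s <= self.short_off_max_s:
            # Quick toggle: advance to the next program, wrapping around.
            self.index = (self.index + 1) % len(SEQUENCES)
        else:
            # Long power-off: reset to the first program.
            self.index = 0
        return SEQUENCES[self.index]
```

This allows a single supply line to control both the ambient lighting and the image sequencing with no extra wiring.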
The above aspects and advantages of the present invention are achieved by a method, an apparatus, and an article of manufacture for an underwater image projection system submerged in a body of water and projecting an image within said body of water.
The apparatus of the invention includes an underwater image projection system submerged in a body of water and projecting an image within said body of water, having an enclosure and a lens assembly, with an at least one projection element with an at least one projected light source projecting an image within said body of water. An at least one light source steering or modulating device is included, and a system controller is coupled to and controls the at least one ambient light source, the at least one projected light source steering or modulating device, and the at least one projected light source. An at least one further image steering device is also provided. A user input device is provided, wherein a user inputs image data to the controller through the user input device, and the controller interprets the image data into a set of image control variables, executes control of the at least one projected light source and the at least one projected light source steering device in coordination with each other, and projects the image through the projection element with the projected light source by controlling the at least one projected light source steering or modulating device and controlling the movement of the at least one further steering device to project from underwater a static or animated image on an underwater surface of the body of water.
An at least one ambient or non-projected light source can also be provided to operate in conjunction with the at least one projected light source. The at least one projected light source can be at least one of an at least one laser, incandescent bulb, halogen bulb, LED, HLED, gas discharge lamp, high intensity discharge lamp or the like for example. The at least one light source steering or modulating device can be at least one of an at least one motor with mirror, galvanometer with a mirror, galvanometer with a dichroic reflector, DLP device, LCOS device, LCD device, D-ILA device, SXRD device, and laser diode device or the like for example.
The underwater image projection system can include a lens element that covers an interface of said underwater image projection system with said body of water and allows light emitted from said underwater image projection system to pass into said body of water. The lens element can contain an at least one optic section, wherein the at least one optic section modifies the direction, shape, pattern, color, or wavelength of the emitted light from said underwater image projection system. The at least one optic can be a divergent optic. The at least one optic can be a convergent optic. The at least one optic can be a grating. The lens element can be interchangeable with lens elements having different optical properties to affect the light emitted by said underwater image projection display system.
The underwater image projection system can have software on the controller that allows a user using the user inputs to define the display area and parameters for display of the image. It may also have software on the controller that allows a user through the user inputs to define a pre-programmed operational call for the display of the image in combination with control of the ambient lighting. The operational call can be at least one of operation calls for display of a moving image or series of images controlling the image and non-image light sources of the instant invention or operation calls for display of a static image or images (such as in a slideshow) controlling the image and non-image sources of the instant invention or lighting shows that control and display light only effects controlling the non-image and/or image sources of the instant invention or an operation call for a multi-media presentation controlling both light sources of the instant invention and off-board elements, such as water features and sound systems.
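A controller might dispatch the operational calls described above through a simple lookup from call name to handler, as sketched below. The call names, handler signatures, and configuration keys are hypothetical; the specification only requires that each call select a coordinated behavior for the image and non-image light sources:

```python
# Each handler coordinates the imaging and non-imaging sources for one
# kind of operational call; these stubs just report what they would do.
def run_slideshow(images, ambient):
    return f"slideshow:{len(images)} images, ambient={ambient}"

def run_animation(frames, ambient):
    return f"animation:{len(frames)} frames, ambient={ambient}"

def run_light_show(ambient):
    return f"light-only:{ambient}"

OPERATIONAL_CALLS = {
    "static_slideshow": lambda cfg: run_slideshow(cfg["images"], cfg["ambient"]),
    "moving_image": lambda cfg: run_animation(cfg["frames"], cfg["ambient"]),
    "lighting_show": lambda cfg: run_light_show(cfg["ambient"]),
}

def execute_call(name, cfg):
    """Look up and run the handler for an operational call."""
    if name not in OPERATIONAL_CALLS:
        raise ValueError(f"unknown operational call: {name}")
    return OPERATIONAL_CALLS[name](cfg)
```

A multi-media operational call would simply be another entry whose handler also signals the off-board elements (water features, sound systems).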
The user input device can further comprise a user interface controller. The user interface controller can be remote from the enclosure. The user interface controller can be electronically coupled through a wireless or wired coupling to the system controller. The user interface controller can permit a user to select an operational call. The user interface controller can be provided with a graphical user interface for selection of image data. The user interface controller can have an input allowing a user to input image data from a computer readable media or through a wired or wireless coupling with a network. The user input image data can be stored on the user interface controller or the system controller.
The operational call can also cause said underwater image projection to communicate with additional system controllers controlling at least one of a further light source, a further water feature, a further video display, a further sound system, a further underwater image projection system, and a further pool or spa jet control system. The underwater image projection system can also include a master controller in communication with the systems controller. The master controller can receive programmed instructions and respond by sending signals to the underwater image projection system to display the image through the system controller and communicates further instructions to further display elements or control elements in communication with the body of water. The master controller alone or in conjunction with a user interface can download new operational calls and instructions from a network.
The at least one laser light source can have sensors monitoring laser light output variables, or the safety system can further comprise sensors monitoring laser light output variables, or the safety system can observe variables from the beam steering or modulation system; if abnormalities are detected, the safety system shuts down the at least one projected light source. The underwater image projection system can have a system to measure the ambient light conditions in the body of water and can use measurements from this system to control imaging or non-imaging light sources.
The enclosure can further comprise an at least one secondary enclosure, the enclosure having the at least one projection light source and at least one projected light source steering device contained therein and the at least one secondary enclosure having the at least one ambient or non-projecting light source therein. The enclosure can contain the at least one projection light source and at least one projected light source steering or modulation device. The enclosure can be watertight.
The underwater image projection system can also include an at least one heat sink for cooling the underwater image projection system. The enclosure can provide for a separate watertight enclosure portion while permitting exposure of the heat sink to water from said body of water or has a thin walled section to promote cooling of the heat sink through said section.
The master controller can be outside the body of water and the enclosure. The master controller can be separate from a user interface controller having a user interface with the user input thereon, and both can be outside the body of water and the enclosure. The master controller can further control an at least one of sound systems, boat controls, water features, bubblers, fountains, waterfalls, laminar jets, water effects, accent lights, pool lights, water special effects, pyrotechnic controls, lights and pool controls. The underwater image projection can further include an at least one user interface controller having a user interface with the user input thereon. The user interface is outside of the enclosure and coupled to the system controller to provide control inputs.
The underwater image projection system can include a further lens element separate from the lens element in contact with said body of water and having an at least one optic section, wherein the at least one optic section modifies the direction, shape, pattern, color, or wavelength of the emitted light from said underwater image projection system. The underwater image projection system can also include an ambient light sensor sensing the ambient light condition in the body of water. The user input can include an at least one switch which cycles through a selection of pre-programmed images and operational calls modifying the display of the pre-programmed images. The underwater projection system can further include a separate remote user interface providing wireless communication with the underwater image projection system, wherein the at least one switch is located thereon.
The method of the invention includes a method for controlling underwater projection of images in or from a body of water, comprising inputting an image into a controller; interpreting the image into executable control data through the controller; controlling an at least one underwater image projection device in the body of water with the controller using the control data; controlling an at least one non-imaging device in conjunction with the at least one image projecting device in the body of water with the controller using the control data; and displaying the image in the body of water.
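The control flow just claimed, inputting an image, interpreting it into control data, then driving the imaging and non-imaging devices in coordination, can be sketched as below. This is an illustrative sketch only: the stroke-list representation of control data and the command tuples are assumptions, not taken from the specification:

```python
def interpret_image(strokes, canvas_w, canvas_h):
    """Interpret an image, given as raw vector strokes in pixel
    coordinates, into controller-readable control data: strokes
    normalized to the 0..1 range a beam steering device is assumed
    to accept."""
    return [[(x / canvas_w, y / canvas_h) for (x, y) in stroke]
            for stroke in strokes]

def control_commands(control_data, ambient_color="blue"):
    """Interleave a non-imaging (ambient) command with per-stroke
    steering commands so both sources act in coordination."""
    cmds = [("ambient", ambient_color)]
    for stroke in control_data:
        cmds.append(("draw", stroke))
    return cmds
```

The resulting command list would then be handed to the device drivers to display the image in the body of water.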
In the method of controlling an underwater projection of images in or from a body of water, the step of controlling an at least one underwater image device can further comprise controlling multiple underwater image devices in synchronization with one another. The step of controlling an at least one non-imaging light source can further include controlling multiple non-imaging light sources in synchronization with one another and with the at least one image projecting device. The method of controlling an underwater projection of images can further include inputting, by a user through a user interface, a program or a pre-programmed input of an at least one operational call into a controller for the at least one underwater projection device, the operational call providing instructions on manipulating the control data and synchronizations for the operation of the at least one underwater projection device and the at least one non-projection light source to provide the desired visual display encoded in the operational call.
The method of controlling an underwater projection of images in or from a body of water can further comprise controlling an at least one off-board feature in the body of water or outside the body of water as part of the operational call.
A computer system for performing the method, and a computer-readable medium having software for performing the method, of controlling an underwater projection of images in or from a body of water are also claimed.
The method of the invention includes a method of operating an underwater image projection display system comprising:
starting an image projection display system by engaging an at least one startup code segment on a controller, powering up, and testing the underwater image projection display (UID) system through a user input or interface or remote command;
inputting an image by engaging an at least one input code segment on a controller, which produces input image data that is processed by image data deconstruction and compilation code segments on a controller, resulting in image controller readable data that is stored and/or transmitted to/in the UID system together with an image display operational call;
translating and/or compiling by engaging an at least one translation and/or compilation code segment on a controller, which translates and/or compiles the stored image data, through a set of instructions corresponding with the operations call for the type of display, into machine readable instructions for the UID system components and any additional systems as indicated by the operations call;
transmitting and commanding by engaging an at least one transmission and control segment on a controller, which transmits the machine readable command instructions to the drivers of the components of the UID system based on the operations call to activate the UID system and any additional controlled systems depending on the operations call;
projecting a resulting image through an at least one code segment on a controller controlling the hardware of the UID system and the image data, image controller data, additional data, and any additional selected variables or adjustments based on user input or sensors, resulting in the projection of an image/graphic, either static or in motion, or desired color effects with coordinated outputs for the additional controlled elements for the selected operations call; and
resetting or looping the UID system by engaging an at least one reset/loop code segment on a controller, allowing the user to adjust, reset, or loop the operations call to allow for changes in sequencing of an image(s), to add multiple operations calls to the UID system display, to adjust the start or ending of a projection, or to allow for a reset, shutdown, and restart of the image projection system via input from the user.
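The operating method above amounts to a small state machine: start-up/self-test, image input and deconstruction, translation into machine instructions, transmission to drivers, projection, and reset or loop. A compressed, assumption-laden sketch (real firmware would be interrupt-driven and talk to hardware drivers; here each step is a stub that records its name):

```python
class UIDSystem:
    """Toy model of the UID operating sequence; all step names are
    illustrative labels for the code segments named in the method."""

    def __init__(self):
        self.log = []

    def _step(self, name):
        self.log.append(name)

    def run_once(self, loop=False):
        # Execute the five forward steps in order.
        for name in ("startup_selftest", "input_image", "translate_compile",
                     "transmit_commands", "project"):
            self._step(name)
        # Final step: either loop the operational call or reset the system.
        self._step("loop" if loop else "reset")
        return self.log
```

A looping slideshow would call the projection steps repeatedly with `loop=True` until the user issues a reset.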
The apparatus of the invention includes a computer including a computer-readable medium having software to operate the computer in accordance with the invention.
The article of manufacture of the invention includes a computer-readable medium having software to operate a computer in accordance with the invention.
Moreover, the above aspects and advantages of the invention are illustrative, and not exhaustive, of those which can be achieved by the invention. Thus, these and other aspects and advantages of the invention will be apparent from the description herein, both as embodied herein and as modified in view of any variations which will be apparent to those skilled in the art.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of the invention are explained in greater detail by way of the drawings, where the same reference numerals refer to the same features.
FIG. 1A shows a plan view of an exemplary embodiment of the instant invention in a body of water projecting an image on a curved surface.
FIG. 1B shows a plan view of an exemplary embodiment (not to scale) of the instant invention in a body of water projecting an image into a body of water.
FIG. 2 shows a cross sectional view of an exemplary embodiment of the underwater light projection display system of the instant invention.
FIG. 3 shows an isometric view of an exemplary embodiment of an image projection device.
FIG. 4 shows a plan view of a method of operation of an exemplary embodiment of the invention.
FIG. 5A shows a flowchart for an exemplary embodiment of the operation of software on an article of manufacture and a method of operating an underwater image display system.
FIGS. 5B and 5C show a flowchart of a further exemplary embodiment of the operation of software on an article of manufacture and a method of operating an underwater image display system.
FIG. 6 shows a system block diagram of a further exemplary embodiment of the instant invention.
FIG. 7A shows a front view of a further exemplary embodiment of the instant invention utilizing an underwater projector as installed in the body of water.
FIGS. 7B and 7C show a front and a cross-sectional view respectively of a further exemplary embodiment of the instant invention using an underwater projection device and a further image steering mechanism.
FIG. 8 shows a further exemplary embodiment of the instant invention utilizing a DLP based device.
FIG. 9 shows a further exemplary embodiment of the instant invention utilizing a DLP in a small form factor.
FIG. 10 shows a still further embodiment of the instant invention having multiple non-imaging light sources.
FIG. 11 shows a plan view of yet a further exemplary embodiment of the instant invention having multiple image projection devices and multiple non-imaging light sources.
FIG. 12 shows a plan view of a further exemplary embodiment of the instant invention coupled to and communicating with further display elements.
FIG. 13 shows a flowchart of an exemplary embodiment of a method of adjusting an image projected underwater to a point in a body of water to conform to non-uniform display surfaces in the body of water.
DETAILED DESCRIPTION OF THE INVENTION
The instant invention is directed to an underwater projection system and controller for controlling image lighting and non-image lighting while projecting to a point in the body of water. The image projection system has an enclosure, an at least one image lighting element/source, an at least one beam/line/projected light source steering device, an at least one additional or further image steering device, an at least one system controller, and an at least one user input or user interface. The user enters an image or image selection through the user input or interface; the system analyzes and converts the image to machine instructions, projects the image through the image projection system, and optionally coordinates this with the operation of the non-imaging lighting as disclosed herein below.
FIGS. 1A and 1B show plan views of an exemplary embodiment of the instant invention in a body of water projecting an image into a body of water. The exemplary embodiment of the underwater image projection display system 10 is shown with an enclosure 30 provided for use underwater in a body of water 7, in this case a pool. The underwater image projection display system 10 projects an image 1000 on a target surface 1010 in the body of water 7, in this case a side 5 and/or bottom 2 of the pool. The term image, as used herein, encompasses any static or non-static image stored in any form that can be interpreted by the associated elements of the underwater image projection display system for display. The enclosure 30 can be mounted in a recess within the wall 5 of the pool or body of water or separately mounted underwater (not shown). The enclosure 30 can be watertight or can provide for flow of water to specific cavities (not shown) to provide for additional cooling while allowing for an at least one watertight electronics compartment(s).
As seen in FIG. 1B, an at least one image projection element or device 100 is provided. The exemplary embodiment is shown having a single image projection element 100, which is coupled to an electrical source (not shown) through an electrical safety connector 20. In further embodiments, as described below, additional image projection elements may be added in the system. These additional projection elements may be engaged in the system or incorporated into the device without departing from the spirit of the invention. Similarly, additional embodiments may include a self-contained power source as a component of the enclosure.
The enclosure 30 also provides for a lens element 90. The lens element 90 may be clear, allowing for direct transition of light emitted by the image projection element 100 and an at least one ambient or non-image projecting light source 104. The lens element 90 may also be comprised of several smaller lens elements or may have a pattern etched into it to affect the projection of light from the image projection element 100 as needed for proper projection of light emitted from the image projection element 100 or the ambient or non-projection light element 104. The lens element 90 may be, for instance but certainly not limited to, a divergent and/or convergent lens at a point for the image projection of light from the at least one image projection element so as to provide a necessary increase in field of projection or similar changes in the display of the projected elements into the body of water 7. The lens element 90 can also be made to be interchangeable or replaceable without impacting the soundness or waterproofing of the housing 30. Further, as the image light projection is also steerable as discussed below, different effects may be included in different areas of the same lens element 90, as further discussed herein below in regards to FIG. 2.
As seen in FIG. 1B, a system controller 200 is provided to control the elements of the exemplary embodiment as shown. The system controller 200 manages and operates the elements of the image projection device 100 and is coupled to and communicates with a user interface controller 50 in this exemplary embodiment. The system controller may also operate the entirety of the underwater image projection display system 10 in further exemplary embodiments. The controllers 50, 200 include software to operate themselves and the underwater image projection display system 10. Software refers to prescribed rules to operate a machine, controller, or machine or device operated with a controller or driver. Some non-limiting examples of software include: code segments; instructions; computer programs; and programmed logic and the like.
The device or system controller 200 is shown as a dedicated printed circuit board 201 with functional elements as discussed herein below. The controller 200 can also, for example, refer to, but is certainly not limited to, any apparatus that is capable of accepting a structured input, processing the structured input according to prescribed rules, and producing results of the processing as output. Some non-limiting examples of controllers include specialized control circuit boards; a computer; a portable computer; a networked computer system; a hybrid combination of a computer and an interactive controller; a tablet; a touch-screen based system; a smart phone; and application-specific hardware to emulate a computer and/or software and the like. A controller can have a single processor or multiple processors, which can operate in parallel and/or not in parallel. A controller can also refer to two or more of these or other examples connected together via a network for transmitting or receiving information between the controllers, such as in a home or house network or a distributed computer system for processing information via controllers linked by a network and a cloud server or enabled cloud computing devices. A network may also include several projection devices or other display devices, such as but certainly not limited to, sound systems, boat controls, water features, fountains, water special effects, pyrotechnic controls, lights and similar devices.
The user interface controller 50 in the exemplary embodiment is shown as coupled to and communicating with the system controller 200 through a wireless link 45. Additional exemplary embodiments, like those shown below, provide for wired coupling and communication as well, for instance through an RS-485 wired connection. The user interface controller 50 is used to provide input to the underwater image projection display system 10 and provide for control by a user. The input can be accomplished through any number of mechanisms, from touch screens to keyboards to voice activation and similar user input devices. The user interface controller 50 can be as advanced as a fully customizable touch display or may simply provide for a limited number of inputs, such as an on/off switch and corresponding arrow buttons to interface with a display. In this exemplary embodiment a touch enabled tablet display is provided as the user interface controller 50.
The user interface controller 50, the tablet in this case, is provided with an information storage device, in this case non-volatile memory 56, to store both a graphical user interface 52 and software to interpret input from the graphical user interface 52, and its own processor with the memory 56. An "information storage device" refers to an article of manufacture used to store information. An information storage device has different forms, for example, paper form and electronic form. In electronic form, the information storage device includes a computer-readable medium storing information such as, but certainly not limited to, software and data and similar electronic information. In further exemplary embodiments of the instant invention additional information storage devices, such as but certainly not limited to dedicated volatile or non-volatile memory, can be provided to receive and store images from users on the user interface controller 50.
This image data can be provided by a user via an external memory storage device or other computer readable media, through an interface, here shown as port 54. It can also be provided through a wireless connection, a wired connection, or through a network connection, either as discrete files or streamed as packets, for instance a stream of packets over the World Wide Web or through a router or other device. Additionally, the image data 1100 as discussed herein below may also be transmitted and stored on the system controller 200 to improve display handling efficiency. In this exemplary embodiment, the image data 1100 is transmitted to and stored on the controller 200, as described in greater detail below, so as to avoid delay due to wireless connection issues.
It is also important to note that the surface within the body of water 7 that the image 1000 is projected on, as shown in the exemplary embodiment, is non-uniform. That is, the image surface has a curvature. Additional software on the controllers 50, 200 provides for input and control of points of adjustment of the image 1000, as described in further detail below. This allows the user and/or the system to adjust the display and compensate for both the conditions of the water being transmitted through and the surface the display is being shown on. This can be accomplished manually or automatically, or through a combination of both by the controllers 50, 200 and the user. The instant invention can be trained to understand the limitations and shape of the targeted portions of the body of water.
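The curvature compensation described above can be pictured as a pre-distortion of the image points before projection. The following sketch assumes a simple spherical target surface with a known radius of curvature; a real system would instead use measured surface data, and the radial-scaling model here is an illustrative assumption, not the patent's method.

```python
import math

def correct_for_surface(points, radius_of_curvature):
    """Pre-distort (x, y) image points so that, when projected onto a
    concave surface of the given radius, the figure appears less
    distorted to a viewer.  Illustrative model only: points are scaled
    radially inward to offset the stretching a curved surface
    introduces toward the edges of the image."""
    corrected = []
    for x, y in points:
        r = math.hypot(x, y)  # radial distance from the image center
        scale = math.cos(r / radius_of_curvature)
        corrected.append((x * scale, y * scale))
    return corrected
```

A point at the image center is left untouched, while points near the edge are pulled slightly inward.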
FIG. 2 shows a front view of the exemplary embodiment of FIG. 1. As seen in the figure, the lens element 90 has various patterns etched on it to provide desired optical characteristics. In the exemplary embodiment shown, a section 92 is provided which has a grating that actively diverges the image 1000 being projected. Another section 94 is provided that allows for a reflective transmission grating. Another section 96 provides for a convergent optic. The lens element 90 may also be provided with a motor or may be manually moved to accommodate changes in lens or optical characteristics. Additional covers may be provided and substituted for the one shown that provide alternative optics. Further, other exemplary embodiments may provide that the lens element 90 may itself be entirely directed to a single optic or optical effect without departing from the spirit of the invention.
FIG. 3 shows a cross-section view of the exemplary embodiment of the instant invention. The image projection element 100 has an at least one image projection or projecting light source 102. The image projection element 100 is also provided with an at least one beam/line/projected light source steering or modulating device 150. In this instance the beam/line/projected light source steering or modulating device 150 is coupled to an at least one motor 152 with a mechanical coupling/transmission mechanism 155. The beam/line/projected light source steering or modulating device 150 is also coupled to a system controller 200. The system controller 200 is provided with software comprised of specific code segments to drive and control the beam/line/projected light source steering or modulating device 150 to steer the output of the image projection element 100, as discussed herein below, which results in an image output 1000 which can be static or moving/animated. The components of the image projection element 100 are mounted to an at least one chassis 125 and coupled to the controller 200 as shown. The controller 200 is thermally coupled to a heat sink 25 to provide for cooling of the controller 200 and the image projection device 100.
In the exemplary embodiment shown, the at least one image projection element 100 is shown as a single projection element or projector. There can be any number of projectors or projection elements within the housing 30 or operating together in the same body of water or water feature 7 attending to image projection duties without departing from the spirit of the invention, but for illustration purposes reference is made to the non-limiting exemplary embodiment of FIG. 1. The projection element 100 has an at least one projection light source or projected light source or graphic light source 102. In this instance the projected light source 102 is a single class 3 LASER. Other non-limiting examples of projected light sources include but are not limited to incandescent bulbs, halogen, LEDs, HBLEDs, gas discharge lamps, high intensity discharge lamps and the like. These can be individual projected light sources or combined into a projection device, such as DLP, LCOS, LCD, D-ILA, SXRD, laser diode, spatial light modulation systems or the like. There may be any number of projected light sources; the example shown is a non-limiting example of the type and number of projected light sources. Further embodiments are disclosed herewith as further non-limiting examples of exemplary embodiments of the instant invention that utilize multiple light sources. The projected light source 102 is coupled to the controller 200 and controlled therewith as further described below.
The projected light source 102 is mounted on the chassis 125. The chassis also mounts the at least one projected light source steering device 150 with a mechanical coupling/transmission mechanism 155 coupling to the at least one motor 159. In the exemplary embodiment shown in FIG. 1, the at least one beam/line/projected light source steering device 150 is shown as a first, panning beam/line/projected light source steering device 152 coupled to and controlled by the system controller 200 and a second, yaw beam/line/projected light source steering device 154 coupled to and controlled by the system controller 200.
The first and second beam/line/projected light source steering devices 152, 154 are shown as mirrors for directing or steering the projected light source 102. Each of the beam/line/projected light source steering devices 152, 154 is coupled via a respective shaft member 156, 158 to a motor 159, each beam/line/projected light source steering device having its own motor in this exemplary embodiment. The motors 159 are coupled to the controller and are activated by the system controller 200 to direct or steer the projected light source 102 in accordance with the software on and methods of controlling and operating the system controller 200 as described herein below. Similar embodiments may utilize a galvanometer to translate and rotate the pitch and yaw respectively in the exemplary embodiment.
Additional embodiments may utilize fewer beam/line/projected light source steering devices; for instance, a single faceted ball or a single drive motor with a more complicated transmission system may be utilized to provide for both pan and yaw motion or additional axes of control, all of which may also be provided without departing from the spirit of the invention. Additionally, some other non-limiting examples of projected light source steering or modulating devices can include but are not limited to reflectors, dichroic reflectors, mirrors, DLP, LCOS, LCD, D-ILA, SXRD, laser diode, and similar devices. The number and examples of the projected light source, the beam/line/projected light source steering devices, and motors can be varied without departing from the invention.
In addition to the projected light source 102, the instant invention is used in combination with an at least one ambient or non-projected light source 104 to assist in projecting graphics underwater. In this instance, the at least one ambient or non-projected light source 104 is shown as a ring of high brightness light emitting diodes (HBLEDs). The at least one non-projected light source 104 is optional and may be independent of the instant invention, including independently controlled. The at least one non-projected light source 104 in the exemplary embodiment shown is coupled to the system controller 200 and is operated in conjunction with the projected light source 102. The ambient light source 104 may also include additional lights not included in the enclosure 30 with the image projection element 100. These lights are coupled to the controller 200 and operated to support the image projection element 100 and the imaging duties it performs. The lights may include, for instance but are certainly not limited to, existing pool lights, after market accent lights, additional lighting and display elements such as fountains, bubblers, and the like, or similar sources of light within or without the body of water 7.
In the exemplary, non-limiting embodiment of FIG. 3, the beam/line/projected light source steering device 150 is moved such that the pixels or dots projected by the projection element 100 are steered or modulated at a high rate of speed to produce the resulting image output 1000, relying on the persistence of vision or phi phenomenon, the tendency of the human eye to perceive the existence of an image even when the image has changed or been removed. The image projection element or source 100 and the at least one beam/line/projected light source steering device 150 are controlled by a system controller 200 with software thereon. The software includes multiple sub-components or code segments for specific functional instructions for the elements of the device. The software and the controller are used to convert images 1020, as discussed herein below, to image data 1100 that is then interpreted and conveyed as instructions to the components of the underwater image projection display system 10, as discussed in greater detail herein below. By quickly moving and tracing an image, the full image appears before the human eye. In the instant invention, the at least one beam/line/projected light source steering device 150 rapidly moves the dot/pixel projected by the projection element 100 at a speed that results in the image output 1000 being perceived by the human eye as a persistent image, either static or in motion, on a target surface or portion 1010 in the body of water 7, for example but certainly not limited to a pool. Additional exemplary embodiments may utilize further projection means and principles to project the image sufficient for display from the underwater imaging system in the body of water.
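The persistence-of-vision timing described above can be sketched as a simple scheduling calculation: the frame period is divided among the points to be traced so the whole figure is redrawn before the eye perceives flicker. The 30 Hz default below is an illustrative figure, not a value taken from the patent.

```python
def schedule_scan(points, frame_rate=30.0):
    """Assign each (x, y) point a dwell start time so that the full
    point list is traced once per frame period.  Returns a list of
    (start_time_seconds, x, y) tuples.  Illustrative sketch only; a
    real beam-steering driver would also account for mirror settling
    time and blanking between strokes."""
    frame_period = 1.0 / frame_rate
    dwell = frame_period / len(points)  # equal time share per point
    return [(i * dwell, x, y) for i, (x, y) in enumerate(points)]
```

At 50 frames per second, two points would each receive a 10 ms dwell slot within the 20 ms frame.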
In the exemplary embodiment of FIG. 1, this movement is provided in a single axis or multiple axes as determined by the at least one beam/line/projected light source steering device 150. The control of the at least one beam/line/projected light source steering device 150 is through software on the controller 200. Further discussion of the operation and implementation of the software and electronics is provided in greater detail in relation to FIGS. 4-6.
FIG. 4 shows a flow chart for a method of operation of an exemplary embodiment of the invention. As noted above, either the user interface controller 50 or the system controller 200 can be utilized to process image data. This may be done in stages on either or both controllers or may be isolated to operations on one or the other controller.
As noted above, in the exemplary embodiment shown in FIG. 1, the controllers 50, 200 have software for handling images 1020 and translating the image 1020 to image data 1100 for further communication and translation to control outputs for the underwater image projection display system 10. The user interface controller 50 allows for input of data from an image 1020 or series of images or moving images or video that are reduced to or provided as image data 1100 and stored in a computer readable format as a computer readable source 1110. Any standard format for image or video recording may be utilized. Some non-limiting examples include but are not limited to JPEG, GIF, Exif, TIFF, PNG, BMP, PPM, PGM, PBM, PNM, WEBP, CGM, SVG, AVI, MPEG, FLV, and similar image and video formats. Based on this compression and storage, the image 1020 can be reduced into or provided as image data 1100. A first step in preparing for the operation of the instant invention, then, is the reduction of an input image 1020 into image data 1100, or provision of the image data 1100 to be transmitted as a computer readable source 1110.
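The reduction of an input image into image data can be pictured with a minimal sketch. The decode step for formats such as JPEG or PNG is omitted here; the function below assumes the image has already been decoded to a grayscale bitmap (rows of 0-255 values) and extracts the sparse list of lit points a beam-steered projector could trace. The threshold value and the point-list representation are illustrative assumptions.

```python
def image_to_point_data(bitmap, threshold=128):
    """Reduce a decoded grayscale bitmap to a list of (x, y) points
    bright enough to be traced by the projected light source.  A real
    pipeline would first decode a standard format (JPEG, PNG, etc.)
    into such a bitmap; that step is not shown."""
    points = []
    for y, row in enumerate(bitmap):
        for x, value in enumerate(row):
            if value >= threshold:
                points.append((x, y))
    return points
```

The resulting point list is the kind of controller-readable data that can then be stored, transmitted, and translated into steering commands.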
The image data 1100 is stored on an at least one computer readable storage media, for instance, but certainly not limited to, in the exemplary embodiment shown, volatile RAM memory or an SD chip or the like as a computer readable source 1110. Alternatively, the image data 1100 may be transmitted by a wired connection or a wireless connection 1130 or through a network 1140 to the system controller 200 or on the user interface controller 50. In a further exemplary embodiment, the system controller 200 has memory that provides for a library of pre-stored image data and operational calls. The library may be used in conjunction with further user input image data or may be limited to the images in the library only. In this embodiment, an at least one user input, for instance a power button, allows for toggling through the pre-programmed images and displays in the underwater image projection display system 10.
In a first step after image acquisition, the image is collected from the computer readable source 1110 in step 900. In programmable embodiments, code segments in the software on the system controller 200 receive and store the image data in step 910. As discussed above, the storage device may be located on either the system controller 200 or the user interface controller 50, or can be an input from a network 1140 or other wireless or wired connections 1130 into a computer readable media such as a memory buffer, or provided on a computer readable media 1120. The result is storage of the image data 1100 in a form usable by the underwater image projection display system 10 and managed by code segments on either the user interface controller 50 or the system controller 200 or both.
In step 920, the image data is stored in the underwater image projection display system 10. Then code segments are utilized in step 925 to interpret and extract the image data 1100 from the storage step 920. An operations call is made from the data input at the user interface 50. Operations calls or "shows" are sets of software or pre-programmed instructions for the display of the image data 1100. These can be pre-programmed, downloaded, or entered by a user. Customizations and alterations may be provided for in some embodiments. The operations call data is sought at step 920. Some examples of operations calls are: operation calls for display of a moving image or series of images controlling the image and non-image light sources of the instant invention 922; operation calls for display of a static image or images (such as in a slideshow) controlling the image and non-image sources of the instant invention 923; lighting shows that control and display light-only effects controlling the non-image and/or image sources of the instant invention 924; or an operation call for a multi-media presentation controlling both light sources of the instant invention and off-board elements, such as water features and sound systems 926. Additional elements, water feature elements and/or sound systems can be incorporated in other operation calls, and controlling media through similar "shows" in an analog fashion may also be provided as selectable user inputs, for example switches, as well. The user interface 50 provides for selection of the operations call by the user, as indicated by the communication line from the user interface to the code segment as an input. Additional elements can be provided for the user interface 50 to be used to provide programming of customized operations calls. These operation calls provide instructions on modifying and manipulating the image data 1100 stored in step 920. The stored image data 1100 is processed as display data in step 925.
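The selection among the operations-call types above amounts to a dispatch from the user's selection to the code segment that handles that kind of show. The call names and the handler-table mechanism in this sketch are hypothetical; only the four show categories come from the text.

```python
def run_operations_call(call_type, handlers):
    """Dispatch an operations call ("show") to its handler.  The four
    recognized call types mirror the examples in the text: a moving
    image show, a static image show, a light-only show, and a
    multi-media show.  `handlers` maps call names to callables."""
    known = {"moving_image", "static_image", "light_show", "multimedia"}
    if call_type not in known:
        raise ValueError("unknown operations call: %s" % call_type)
    return handlers[call_type]()
```

A user selection at the interface would simply supply `call_type`, and the matching handler would carry out the pre-programmed or user-entered instructions.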
The operations call initiates the extraction and interpretation code segments to access the image data 1100 stored in step 920 as noted, and processing of display data in step 930 per the command call. The image data 1100 is then translated via further image translation code segments that translate the display data into movement inputs for the machine control code segments and system controller 200 in step 925. The extraction and interpretation step 920 and the processing step 925 provide display data inputs to the machine commands and synchronization step 930.
The data transmitted from the image data to display data translation step 925 is interpreted in the machine control and synchronization step 930, and a set of movement commands is output from the machine control and synchronization step 930. The movement commands allow for movement of the at least one beam/line/projected light source steering device 150 to commence with the rapid movement of the at least one beam/line/projected light source 150 to draw or project the image 1000 on the target portion 1010 of the body of water 7.
In addition to the movement commands, image color data, depth of field data, and similar image variables can also be stored, retrieved, and translated into machine instructions. The image color data can be used to change the color of the image source light or can be used in multiple projection and non-projection light source devices to change colors for the operations call. Similarly, the image data can also include non-projection image or ambient light data. The non-projection image light data can be used in conjunction with an at least one ambient light source, for instance, but certainly not limited to, multiple high brightness LED light sources in pendant lights in the body of water. These HBLEDs can be varied in color to provide contrast color to the beam/line/projected light source or simply for visually pleasing effects in conjunction with the image display.
Synchronization in step 930 can optionally engage additional light sources, image or non-image alike, or additional components, such as but not limited to water features, bubblers, fountains, waterfalls, laminar jets, water effects, accent lights, pool lights, sound systems, boat controls, pyrotechnic controls, pool controls, or similar components, in step 935. The optional communications step 935 may be included and initiated by the operations call in step 920. The synchronization in step 930 and the communications with additional components in step 935 can include multiple projection elements like those shown in FIGS. 11-13. It can also engage and synchronize multiple elements in a single enclosure.
In step 940, the image projection system component controllers/drivers with code segments for driving same are engaged, and machine data alone or in conjunction with additional image data is transmitted from the machine command and synchronization step 930 to drive movement in the underwater image projection display system 10.
After communicating the machine commands to the drivers in steps 930-940, the image 1000 is projected by the underwater image projection display system 10. The image 1000 is displayed underwater in the body of water 7 on a target portion 1010. In an optional further step, a method of correcting the image 1000 is provided in step 950 to accommodate changes to the displayed image to fit the display environment as it is displayed.
In this case, in optional step 960, the need for image correction is identified. This can be done automatically or manually, or as a combination of both automatic and manual components of the projection system and the user. The method of operation proceeds to step 962, in the exemplary embodiment, to obtain input from a user. This is optional, as automatic image correction could be conducted without user input. However, input from the user can be sought through the user interface controller 50. An image correction command is selected in step 962 and applied in step 966. The image correction may be applied directly, or a preview may be provided through the user interface. Examples of the methods of image correction are discussed further herein below. A final check is made in step 968 to determine if additional correction is needed. The affirmative branch loops back to step 960. The negative branch proceeds to step 970.
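The check-correct-recheck cycle of steps 960-968 can be expressed as a small loop. The four callables below are hypothetical hooks standing in for the system's own detection, user-input, and application code segments; the round limit is an added safeguard, not part of the described method.

```python
def correction_loop(image, needs_correction, get_command, apply_command,
                    max_rounds=10):
    """While correction is still needed (step 960), obtain a correction
    command from the user or a sensor (step 962) and apply it
    (step 966), then re-check (step 968).  The hooks are illustrative
    placeholders for the system's actual code segments."""
    rounds = 0
    while needs_correction(image) and rounds < max_rounds:
        image = apply_command(image, get_command())
        rounds += 1
    return image
```

With a numeric stand-in for the image state, the loop converges once the correction test passes.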
Finally, once the image has been displayed and the instructions in the operating call have been executed, the system may be looped to continue the instructions per the operating call or show and continue to send control instructions to the image projection system, or it may be returned to a starting state/end state in step 970. In this way, the instant invention takes an image as input from a user, stores the image data, retrieves and interprets the image data in response to an operations call, converts the image data to machine inputs, executes the machine inputs, and causes an image to be displayed per the instructional data of the operations call. The method of operation can include additional components, such as but not limited to additional imaging and non-imaging lights or water features or sound systems, which can be synchronized per the synchronization step. Additional image data, such as but not limited to color and dimensional data, may also be stored and utilized in the display system. Finally, an optional method of correcting the image for display on the typically non-uniform surfaces of the body of water is also provided.
FIG. 5A shows a flowchart for an exemplary embodiment of the operation of software on an article of manufacture and a method of operating an underwater image display system. FIGS. 5B and 5C show a flowchart of a further exemplary embodiment of the operation of software on an article of manufacture and a method of operating an underwater image display system. For the sake of this description, reference is made to the elements of the MEMS device described herein. The method could be utilized with any of the respective embodiments of the at least one image projection device 100 in combination with any of the elements or embodiments discussed or described herein. The method of operation is facilitated through software residing on an electronic device with a controller or computer or interactive controller as discussed above. It is also embodied in the code segments stored on computer readable media. The order of the steps is presented for illustrative purposes only; the order of any particular step may be varied or changed. Additionally, for each step in FIGS. 5A-5C a code segment(s) is specified, indicating one or more code segments, as indicated by the parenthetical. The term "one or more" has been removed from the figures for the sake of brevity, but is intended.
The method of operation starts with a code segment for powering up and testing the system 2100. The startup step engages an at least one starting code segment on a controller, powering up and testing the Underwater Image Display (UID) system. This can be initiated through a user input or interface or remote command. The system, as shown and discussed, has a power input and power management system 610 and a zero cross detect circuit 620 for power modulation. The user interface controller 50 provides inputs to the system controller 200 as shown, and a startup command is sent when the user begins operations in step 2100.
A calibration step 2125 is shown in the embodiment of FIG. 5B and may be run to calibrate and center for position on a target section 1010 of the body of water 7. The calibration step engages an at least one calibration code segment on a controller, calibrating and aligning the imaging system for position on a target section of a body of water. A test point may be projected from the at least one image light source 102 through the beam steering or modulation device 150 and adjusted through inputs and software on the user interface controller 50 and/or software on the system controller 200 in step 2125, for instance. A further diagnostics step, engaging an at least one test and diagnostics code segment on a controller which allows for projection of a test image through the UID system and diagnostic checks of the system components, is provided for at step 2150.
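The test-point calibration above can be reduced to a simple offset computation: compare where the projected test point actually landed with the intended center of the target section, and store the difference as a steering correction. The coordinate units and sign convention here are assumptions for illustration.

```python
def calibrate_center(measured_point, target_point):
    """Return the (pan, tilt) offset to add to future steering commands
    so the projected test point lands on the target center.  A minimal
    sketch of the centering performed in the calibration step; a real
    system might average several test points."""
    mx, my = measured_point
    tx, ty = target_point
    return (tx - mx, ty - my)
```

The stored offsets would then be applied to every subsequent movement command for that target section.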
In the embodiments of both FIGS. 5A and 5B, the step of inputting an image engages an at least one input code segment on a controller, which produces input image data that is processed by an at least one image data deconstruction and compilation code segment on a controller, and the resulting image controller readable data is stored or transmitted or stored and transmitted to/in the UID system. In FIG. 5A, this is done together with an image display operational call. In FIG. 5B, as described below, a further step is provided through a user interface. An image 1020 is input to the user interface, in a manner substantially as described herein in relation to the device embodiments, to produce input image data 1100. This can be done in any manner indicated; in this instance it is provided to the user interface controller 50 and stored there, and communicated through the RS-485 communications line to the system controller 200 in method step 2250 as described herein below. The image data 1100 is deconstructed or translated and compiled in step 2300 and the resulting data stored in the memory 210 of the system controller 200.
In FIG. 5B, the step of inputting user data through an at least one user interface code segment(s) on a controller and a user input device to produce additional data is provided in step 2250. This can include but is not limited to selection of an operational call from the user for the UID system through the user interface. This additional data is stored or transmitted or stored and transmitted to/in the UID system. The additional data, including the selection of an operational call, in step 2250 can be made for instance, but is certainly not limited to being made, from the user interface 50, and this data can likewise be transmitted to the system controller 200.
In step 2300, a translation and/or compilation step is provided, engaging an at least one translation and/or compilation code segment on a controller which acts to translate and/or compile the stored image data, through a set of instructions corresponding with the operations call for the type of display, into machine readable instructions for the UID system components and any additional systems as needed. The at least one code segment of software on the system controller 200 translates the stored image data 1100 through a set of instructions corresponding with the operational call selected. As noted above, operational calls can include, for example but are certainly not limited to, projection of a moving image with image and non-image light control, projection of static images individually or in series with image and non-image light control, projection of colors or shapes with image and non-image light control, control of just non-image lights, projection and control of the image and non-image light control in conjunction with additional water features and/or a sound system, or similar structured shows or output.
In step 2400 of the exemplary method shown, a transmission and command step engaging an at least one transmission and control segment on a controller transmits the machine readable command instructions to the drivers of the components of the UID system based on the operations call, to activate the UID and any additional controlled elements depending on the type of operations call and what controls are needed from the data. In a non-limiting example relating to the exemplary embodiment of the devices of the instant invention, the system controller 200 would send out command instructions to the drivers based on the selected operations call made at the user interface controller 50 to activate the image display 100 and any additional controlled elements, as shown in the further FIGS. 4 and 14. For example, but certainly not limited to, depending on the type of call and what controls are needed, as shown in FIG. 6, instructions from the system controller 200 can then flow to the laser driver 670, the MEMs device driver 650, the LED driver 630, and any additional light sources, water features, and/or sound system and the respective controllers through the RS485 communications link or a wireless link. The at least one image light source 100, here for instance lasers 672, 674, 676, is engaged and the at least one beam/line/projected light source steering device 150, here a MEMs device 660 with optics 690, 695, steers the beam to produce the desired images with or without motion.
An additional input step is provided in FIG. 5C. The step of inputting additional command information by engaging additional command code segment(s) on a controller is provided at 2450. The additional command information allows for variations and modulation of, for example but certainly not limited to, color, depth (e.g. 2D+depth or 3D), or other variables associated with the image data, and for control of multiple source points, e.g. multiple image light sources and ambient light colors and sources, or variables involved in music presentations, or for a pre-programmed timer, or other variables to switch images or synchronize image or projection change cues. These can be communicated or stored as numeric variables; for instance, the values can be computed from the red-green-blue (RGB) values, or other chromatic representations of the color space are possible, for example: monochrome; hue-saturation-value (HSV); YUV, which is a color model used for encoding video; cyan-magenta-yellow (CMY); and cyan-magenta-yellow-black (CMYK). These in turn can include additional data points based on any available compression or data representations for variables in conventional imaging as additional command information. The additional command information can also be interpreted and communicated by the system controller 200 to the respective components. The additional data may include variables involved in music presentations or for a pre-programmed timer, or other variables to switch images or synchronize image or projection change cues in step 2300.
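The conversion between the RGB values and the hue-saturation-value representation mentioned above can be done with the Python standard library's colorsys module, shown here as a minimal sketch. The 8-bit input range is an assumption; colorsys itself works on 0.0-1.0 normalized components.

```python
import colorsys

def rgb_to_hsv_command(r, g, b):
    """Convert an 8-bit RGB triple to the hue-saturation-value (HSV)
    representation, one of the chromatic forms listed in the text.
    Returns (h, s, v) normalized to the 0.0-1.0 range used by the
    standard-library colorsys module."""
    return colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
```

Pure red, for example, maps to a hue of 0.0 with full saturation and value.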
Per the image data 1100 and the operations call selection, made in step 2200 or 2250, an at least one of the image projection element, the non-image projection lighting, and the additional water features and/or sound system would be operated to display the resulting image 1000 or desired color effects, with coordinated outputs for the additional water features and/or sound system for the selected operations call. Step 2500 provides the step of projecting a resulting image through an at least one code segment on a controller controlling the hardware of the UID system and the image data, image controller data, additional data, and any additional selected variables or adjustments based on user input or sensors, resulting in the projection of an image/graphic, either static or in motion, or desired color effects with coordinated outputs for the additional water features and/or sound system for the selected operations call.
As noted above with respect to the flow chart of FIG. 4, an adjustment method may be provided for correcting the image output based on variables in the projection surface or target section 1010 of the body of water 7, or variables in the water itself. The method 2550 is substantially the same as that described above in relation to FIG. 4 and as further described therein. An adjustment step 2550 engages an at least one adjustment code segment on a controller for correcting the image output based on variables in the projection surface or target section of the body of water, or variables in the water itself, through the user interface or automatically through sensors. The adjustment may also provide, but is not limited to, adjustments through a user controller to manipulate the resulting image to compensate for display area/portion contours and other variables in the body of water.
In a final step, a reset or loop step engaging an at least one reset/loop code segment(s) on a controller allows the user to adjust, reset, or loop the operations call to allow for changes in sequencing of an image(s), or to add multiple operations calls in the display, or adjust the start or ending of a projection, or to allow for a reset and restart of the image projection system via user input. The software on the user interface can, for instance, allow for adjustment of the operations call to allow for changes in sequencing of an image or the start or ending of a projection, or to allow for a reset and restart of the image projection system in step 2600.
Thus, in this instance, the additional exemplary embodiment receives a user-inputted image 1020 and communicates it to the underwater image projection display system 10 as image data 1100; through the user interface 50, selections of an operating call or show are made by the user to provide the desired projection or display of moving or static images or a projected light show with non-projected light control, alone or together with additional water features and/or a sound system display.
FIG. 6 shows a functional plan view of the system controller and the system components in an exemplary embodiment. The system controller 200 is provided and is coupled to and communicates with the system components. A 12VAC or 120VAC power input is provided to a power management subsystem 610. The power management system is in communication with a zero cross detect circuit 620 to provide for modulation detection in the case of using such modulation as a control input. The system controller 200 is coupled to onboard memory storage 210. The system controller 200 is also provided with on-board wired 40 or wireless 45 communications, in this instance an RS485 input. This provides communication with a user interface controller 50 or a master controller 80, as shown for instance in FIGS. 10-12 below, or with additional elements, such as additional water elements or sound systems, or the like.
The system controller 200 can also control the at least one ambient or non-image projection light source 104. In the instant embodiment this is provided through an LED driver 630 driving three high output LEDs, one red 632, one green 633, and one blue 634. The ambient light is controlled both as a function of the programmed image display software, using it in conjunction with the image projection device, and as a component of simply controlling the ambient light exclusive of the image display device. An at least one light level sensor 48 is coupled to the controller 200. The light level sensor 48 allows for automatic sensing of an ambient light level in the body of water 7 and can be a single sensor or an array of sensors and can include sensing from other systems outside of the lighting systems of the body of water 7. The output of the at least one light sensor 48 can also be incorporated in the operation of the underwater image projection display system 10. Two non-limiting examples of its use include establishing levels for the ambient lighting 104 in the image projection display system 10 and turning the image projection display system 10 and/or additional lighting on and off with the sunset or sunrise.
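The sunset/sunrise switching just described can be illustrated with a small hysteresis rule; this sketch and its threshold values are assumptions for illustration only, not part of the disclosure:

```python
# Hypothetical sketch: switch the display on at dusk and off at dawn from an
# ambient light reading (sensor 48), with hysteresis so the system does not
# flicker near the threshold. The lux thresholds are illustrative.
def update_power_state(lux: float, currently_on: bool,
                       on_below: float = 10.0, off_above: float = 50.0) -> bool:
    """Return the new on/off state for the projection/lighting system."""
    if not currently_on and lux < on_below:
        return True          # dark enough: turn the display on
    if currently_on and lux > off_above:
        return False         # bright enough: turn the display off
    return currently_on      # inside the hysteresis band: no change
```

The gap between `on_below` and `off_above` prevents rapid toggling as light levels drift through dusk or dawn.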
In the exemplary embodiment shown, the system controller 200 is coupled to and communicates with an image projection device 10 and the image projection element 100. In this exemplary embodiment, the image projection element 100 is the set of lasers; the three discrete color laser lights are the image light source 102 in the image display device 100. In this case the laser driver 670 controls a red laser source 672, a green laser source 674, and a blue laser light source 676. These are then passed to a beam combiner 680, which sends the laser light sources 672, 674, 676 through the beam collimator to a converging optic 690. The converging optic 690 then passes the emitted light to a micro-electro-mechanical systems (MEMs) device 660. The MEMs device 660 comprises a grid of nano or pico mirror elements, or a similar micro-electro-mechanical device, controlled by the MEMs driver 650 to steer an image light source, in this instance the combined laser beam. The MEMs driver 650 is coupled through a digital-to-analog converter 640 to the system controller 200. After reflecting from the MEMs device 660, the reflected light begins to paint an image 1000 by passing through a divergence optic 695 to the targeted portion 1010 of the body of water 7.
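As an illustrative sketch only (not taken from the disclosure), painting one pixel of the image can be thought of as two mappings: pixel color to fractional drive levels for the three laser sources, and pixel position to a two-axis mirror deflection. The functions, scaling, and maximum deflection below are assumptions:

```python
# Hypothetical sketch: per-pixel drive values for the red/green/blue laser
# sources (672/674/676) and a normalized two-axis MEMs mirror deflection.
def laser_drive(pixel_rgb, max_level=255):
    """Scale an 8-bit RGB pixel to fractional drive levels for the three lasers."""
    r, g, b = pixel_rgb
    return (r / max_level, g / max_level, b / max_level)

def mirror_angles(x, y, width, height, max_deg=12.0):
    """Map a pixel coordinate to symmetric mirror deflection angles in degrees."""
    ax = (2.0 * x / (width - 1) - 1.0) * max_deg   # horizontal deflection
    ay = (2.0 * y / (height - 1) - 1.0) * max_deg  # vertical deflection
    return (ax, ay)
```

Scanning the mirror through each (x, y) while modulating the three drives raster-paints the combined beam across the targeted portion of the water.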
An additional watchdog circuit is provided to add a measure of safety in operating the laser of the exemplary embodiment. A sensor 710 measures the optical intensity of the emitted light and communicates it to the watchdog circuit 700. The watchdog circuit 700 communicates with the laser driver 670, and the MEMs device 660 communicates its operational status to the watchdog circuit 700. If either the sensor 710 input or the MEMs device 660 input is abnormal, indicating a malfunction in the laser or the steering device, the watchdog circuit 700 shuts down the laser driver 670 through its coupling 720, which shuts down the laser light sources 672, 674, 676 to protect any user from having the laser light freeze in place and potentially cause harm.
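The watchdog decision can be sketched as a simple predicate; this is an illustrative model, with the intensity limits and status strings being assumptions rather than values from the disclosure:

```python
# Hypothetical sketch of the watchdog 700 decision: the laser driver 670 may
# run only while the optical-intensity reading (sensor 710) is in range AND
# the MEMs steering device reports normal scanning. Limits are illustrative.
def watchdog_ok(intensity: float, mems_status: str,
                min_ok: float = 0.1, max_ok: float = 1.0) -> bool:
    """Return True if the laser driver may keep running, False to shut down."""
    sensor_ok = min_ok <= intensity <= max_ok   # beam present and not overdriven
    steering_ok = mems_status == "scanning"     # mirror still sweeping the beam
    return sensor_ok and steering_ok
```

A `False` result corresponds to the circuit cutting power through coupling 720, so a stalled mirror can never leave a stationary beam energized.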
Thus, the system controller 200 in the exemplary embodiment shown provides control of ambient or non-image lighting in conjunction with an image light source, in the embodiment shown several lasers, or separately without the image light source, to provide a pleasing display of static or moving images underwater in a body of water 7.
FIG. 7A shows a front view of a further exemplary embodiment of the instant invention utilizing a MEMs device. As noted above, a MEMS device may be utilized as the at least one projected light source steering or modulation device 150 as shown. The at least one image projection element or device 100 and the projected light source 102 are combined in the MEMs device 300 and enclosed therein, with a heat sink (HS) on the exterior to provide for heat dissipation. The MEMS device 300 is coupled to an electronics section 200 internally, with an additional electronics section 220 controlling the MEMS device 300. As best seen in FIG. 6, the projection element 100 has an at least one image projection or projecting light source 102, here three diode lasers in red 372, blue 374, and green 376, respectively. An at least one additional or further optic or steering device 3220 is also provided in FIG. 7A. The at least one further or additional optic is shown as an additional mirror used to steer the projected image 1000. This is in addition to the at least one beam/line/projected light source steering or modulation device 150 that is coupled to an electronics section 200 within the MEMs device 300, as shown in FIGS. 6 and 7A. This section can include, for instance, the controller for the MEMS device, software for driving same, controllers for the laser drivers, and similar software and/or hardware. The electronics section 220 is a component of the system controller 200 and provides localized control of the MEMS device as a part of the system controller 200. A series of ambient or non-projection light elements 104 are also provided around the system.
FIGS. 7B and 7C show a front and a cross-sectional view, respectively, of a further exemplary embodiment of the instant invention using a MEMs device and a further image steering mechanism. In the cross-sectional view, the image projection system 10 is again a MEMs device similar to the embodiment shown in FIG. 7A, though any of the previously mentioned image sources/projected image steering systems could be employed. The embodiment has additional LED ambient light sources 104, a lens body 90, and a frame or enclosure 30. In this instance the ambient light sources 104 move with the enclosure 30; however, these light sources may also be located on a stationary portion of the enclosure or otherwise isolated from movement.
In this exemplary embodiment, the enclosure 30 includes a tilting device with two axes for rotation of the device. The two axes are represented by two threaded rods coupled to motors acting as a further image steering element 3220. The two rod elements 36, 38, used as the at least one further image steering device 3220, allow the entire underwater image display system 10 to be translated and thereby move the projected image 1000. This would function at the point of installation, where the tilting mechanism would provide both pitch and yaw control of the underwater image display system, as seen in FIG. 7C. The pitch or yaw motions are provided by an at least one motor, here two motors 151, 153 coupled to the rod elements 36, 38, which can move independently or in coordination with one another. A third axis of motion could easily be provided, such as a twisting or similar motion about the outside perimeter of the device. The rod elements 36, 38 are engaged between the bottom of the enclosure and a further plate 39 with a flexible mount 33 therebetween. By moving the rod elements 36, 38, the enclosure may tilt and in the process move the image projection device, here a MEMs device 300.
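The pitch/yaw geometry of the two-rod tilting mechanism can be sketched as follows; this is an illustrative model in which each rod is assumed to act at a fixed lever arm from the flexible mount, and the lever-arm value is hypothetical:

```python
# Hypothetical sketch: pitch and yaw of the tilting enclosure from the
# extensions of the two threaded-rod elements 36 and 38, each driven by its
# own motor (151, 153). Geometry and lever arm are illustrative assumptions.
import math

def tilt_angles(rod36_mm: float, rod38_mm: float, lever_arm_mm: float = 60.0):
    """Return (pitch_deg, yaw_deg) for the given rod extensions."""
    pitch = math.degrees(math.atan2(rod36_mm, lever_arm_mm))  # rod 36 -> pitch
    yaw = math.degrees(math.atan2(rod38_mm, lever_arm_mm))    # rod 38 -> yaw
    return (pitch, yaw)
```

Driving the two motors independently sweeps the projected image along either axis; driving them in coordination traces a diagonal or curved path.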
Additional mechanisms may also be used to move the full enclosure 30 of the instant invention to provide motion in the projected image. For instance, a gimbal-in-gimbal setup would permit easily controlled movements, as would ball-joint/socket or similar devices. These other devices may also utilize a single motor and are fully embraced herein; similarly, the further image steering device 3220 may also encompass a single faceted ball moving in at least two axes or a single drive motor with a more complicated transmission driving a similar gimbal. A printed circuit board 201 is provided, and the electronics section 220, the image projection device 100, here again a MEMs-based device acting both as the projected light source and as the first projected light source steering device, and an ambient light source 104 are all shown in the embodiment of FIG. 7C.
FIG. 8 shows a further exemplary embodiment. The embodiment of FIG. 8 produces an image 1000 utilizing a projection light source 102 and at least one steering device 150 in one unit, here a Digital Light Processing (DLP) image projection source and image steering device 400. The embodiment uses a similar circular-shaped body or enclosure or frame member 30 as that seen in FIGS. 7A and 7B. Coupled to the frame member 30 is an optional at least one ambient light, here a set of ambient HBLED lights 104. An electronics package 220 enables communication with the at least one image projection device, here the DLP projector 400. A further image steering mechanism 3220 is provided which provides the ability to move the image 1000 around the pool for optimal viewing on the typically irregular pool sides and/or bottom 2, 5. In operation, the DLP projection system 400 utilizes an at least one image light source 102, in this instance multiple HBLEDs, which are projected onto an image steering device 150, in this instance a micro mirror array on a silicon chip; a controller in the DLP projection system acts as the system controller 200. The at least one image light source, the multiple HBLEDs, projects the image through the system. The operation of a DLP-based LED projection mechanism is known and the specific details are not covered here. However, as a brief summary, the incident light is either reflected off the micro mirror array or sent to an image dump, i.e., not reflected as a part of the image, resulting in a full color image projection.
The at least one further image steering device 3220 is shown here as a two-axis movable reflective element or mirror. Further embodiments may utilize, for instance but certainly not limited to, an at least one single-axis mirror, a galvanometer, an electric pitch motor, an electric yaw motor, or the like. The at least one further image steering device may effect steering through adjustment of an image reflective element or may simply move the entire light body or frame, for instance having a further frame member coupled to the frame member 30 as depicted in FIG. 7C. In this instance the further frame member, like that noted in FIG. 7C above, would have a similar DLP projection element stationary on the printed circuit board 201, and the printed circuit board 201 would be moved with the frame member 30. In addition, in the exemplary embodiment shown, the further image steering device 3220, alone or in conjunction with the image steering device 150 within the image projection device, here the DLP projection device 400, provides the ability to adjust at least one image parameter to an at least one pool variable. For instance, the image source steering device 150, here the micro mirror array within the DLP projection device 400, in conjunction with the at least one further image steering device 3220, can provide skew controls to accommodate the non-uniform surface of the pool bottom 2 or sides 5, as shown in FIGS. 1-2 and 10-13.
In a particular exemplary embodiment, the system controller (not shown) within the DLP projection device 400 further comprises an additional controller or electronics section 220 with software that is “trained” or set up by a user to provide skew movement values for test projection image(s) 1000. The additional controller communicates with the image source steering device 150 and the further image steering device 3220. Based on this input, the software, through adjustment with either the image source steering device 150 alone or in combination with the further image steering device 3220, provides a programmed variance for moving the projected image around the pool while adjusting the parameters of the resulting image to maintain the viewed image throughout the non-uniform projection surfaces of the water feature. This allows for full, simulated motion of an image throughout the pool with minimal distortion. The result is near photorealistic images and motion of an animated image throughout the pool.
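The trained skew correction described above can be sketched as a per-corner pre-warp; this is an illustrative simplification (a full implementation would use a projective transform), and the function names and offsets are hypothetical:

```python
# Hypothetical sketch of a trained skew (keystone) pre-warp: each corner of
# the test image is shifted by a user-trained offset so the projection
# appears rectangular on a sloped or contoured pool surface.
def apply_skew(corners, trained_offsets):
    """Shift each (x, y) image corner by its trained (dx, dy) correction."""
    return [(x + dx, y + dy)
            for (x, y), (dx, dy) in zip(corners, trained_offsets)]
```

During training, the user nudges each corner until the test image looks correct; the stored offsets are then re-applied as the image is moved to new target sections.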
FIG. 9 shows a further exemplary embodiment of the instant invention utilizing a DLP in a small form factor. Similar to FIG. 8, the embodiment of FIG. 9 produces an image 1000 utilizing a projection light source 102 and at least one steering device 150 in one unit, again here a DLP image projection source 400, though other technologies providing the combination of the functions of projecting the image light source and the steering function in a compact unit could also be utilized in the exemplary embodiment shown in FIG. 9. These include but are certainly not limited to LCOS, LCD, DLP, hybrid and laser projection systems, spatial light modulation systems, and the like. The embodiment depicted has a smaller frame or body member 30 without any of the at least one ambient lighting elements 104 shown in FIG. 8. The embodiment of FIG. 9 utilizes a further image steering device 3220, here shown as a multi-axis mirror. The combined projection light source and steering device in the DLP image projection device 400 is used, as above. The smaller footprint afforded the embodiment of FIG. 9 fits smaller form factors in various water features and shows the instant invention without ambient lighting elements.
FIG. 10 shows a still further embodiment of the instant invention having multiple non-imaging light sources. The further exemplary embodiment provides for multiple ambient light sources 104 indicated in positions throughout the body of water 7. Here the body of water 7 is a pool with several ambient or non-image projecting light sources 104 embedded in its sides and floor. The non-image projecting light sources 104 are coupled to and communicate through wireless connections 45 with a master controller 80 in the user interface controller 50. The master controller 80 and user interface controller 50 are, in this instance, located outside the body of water 7 and coupled to the underwater image projection display system 10 via a wired communications line 40. In operation, the further exemplary embodiment shown synchronizes the non-image projecting light sources 104 in conjunction with the image light source 102 of the underwater image projection display system 10. The underwater image projection display system 10 utilizes components and methods similar to those herein described, including the synchronization of the non-image projecting light sources 104 in conjunction with the operational calls and display of image 1000 as shown.
FIG. 11 shows a plan view of yet a further exemplary embodiment of the instant invention having multiple image projection devices and multiple non-imaging light sources. In the exemplary embodiment shown, a master control system 80 is provided separate and apart from the user interface 50, which is coupled to and communicates with the master control 80 through a wireless connection 45. The master controller 80 communicates through wired connections 40 to multiple non-image projection light sources 104 throughout the body of water 7, here a public fountain display. In addition, multiple image projection systems 10 are provided, each being similar to the previously disclosed image projection systems 10. The multiple projection systems 10 are utilized to display images at multiple target sections 1010, 1011 of the body of water 7. These images 1000, 1001 may be synchronized, overlapped, or combined in their displays in any manner desired, or may be operated independently of one another. The master controller 80 communicates and synchronizes the display in the manner substantially described herein above, utilizing the user interface 50 to select operations calls for the image projection systems 10 and the non-image projection lighting systems 104, either together or alone.
FIG. 12 shows a plan view of a still further exemplary embodiment of the instant invention. The exemplary embodiment shown is similar to that of FIG. 11 above, having a master controller 80 in wireless communication 45 with a user interface controller 50. The master controller 80 is also coupled to and controls at least two underwater image projection display systems 10 and multiple non-image projection light sources 104 through RS-485 wired connections 40. In addition to the multiple underwater image projection display systems 10 and the non-image projection light sources 104, the master controller 80 is in communication with and synchronizes with additional water elements 1400 and sound systems 1500. The operations calls made in this embodiment include controls for and synchronization of the image projecting systems 10 not only with the non-image projection light sources 104, but also with the additional water elements 1400 and sound system 1500. This can include additional data representing image change cues synchronized to the music and change cues for the output and control of the additional features. As noted, non-limiting examples of these additional features include but certainly are not limited to water features, bubblers, fountains, waterfalls, laminar jets, water effects, accent lights, pool lights, sound systems, boat controls, pyrotechnic controls, pool controls, or similar components.
FIG. 13 shows a flowchart of an exemplary embodiment of a method of adjusting an image projected underwater to a point in a body of water to conform to non-uniform display surfaces in the body of water. As part of one of the aforementioned exemplary embodiments, or alternatively as a part of another underwater projection system, a method for adjusting or correcting the image display is provided. In a first step, an underwater image projection system is engaged 3100. The image is displayed from a projector in the body of water on a target section of the body of water. The target section may be non-uniform, or the water may interfere with the projection in such a way that it distorts the image. In a second step 3200, a correction is made to the projection of the image. In this instance the correction can be one of any number of mechanisms to adjust the image. This includes skew adjustment, pitch adjustment, pincushioning, aliasing and anti-aliasing, and similar digital adjustment techniques. This may also include adjustment of color, saturation, contrast, or other image qualities or characteristics commensurate with a projected image. These options may be presented to the user via a user interface.
The selected correction is then applied to the image in step 3300. The application of the image correction may be through a user input or user interface controller alone or together with instructions from a system controller. The correction may be applied in a preview and shown in the user interface prior to the application to the projected image. After application of the correction, the method may be exited or looped to review or revise the effect of the correction, and if needed a further or additional correction may be made, as seen in step 3400.
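The preview-then-apply loop of steps 3200-3400 can be sketched as follows; this is an illustrative construction, not code from the disclosure, and `correction_loop` and `accept` are hypothetical names:

```python
# Hypothetical sketch of the correction loop: preview each candidate
# correction, apply it only if the user (or controller) accepts the preview,
# and repeat until the list of candidate corrections is exhausted.
def correction_loop(image, corrections, accept):
    """Apply each correction the accept() predicate approves; return the result."""
    for correct in corrections:
        preview = correct(image)    # step 3300: preview the corrected image
        if accept(preview):
            image = preview         # accepted: the correction is applied
    return image                    # step 3400: exit, or loop with new corrections
```

Here `accept` stands in for the user reviewing the preview in the interface; an automatic mode could substitute a sensor-driven quality check.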
The embodiments and examples discussed herein are non-limiting examples. The invention is described in detail with respect to preferred or exemplary embodiments, and it will now be apparent from the foregoing to those skilled in the art that changes and modifications may be made without departing from the invention in its broader aspects, and the invention, therefore, as defined in the claims is intended to cover all such changes and modifications as fall within the true spirit of the invention.