CROSS-REFERENCE TO RELATED APPLICATIONS

The present application is related to and claims the benefit of the earliest available effective filing date(s) from the following listed application(s) (the “Related Applications”) (e.g., claims earliest available priority dates for other than provisional patent applications or claims benefits under 35 USC §119(e) for provisional patent applications, for any and all parent, grandparent, great-grandparent, etc. applications of the Related Application(s)).
RELATED APPLICATIONS

For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 12/214,422, entitled SYSTEMS AND DEVICES, naming Edward K. Y. Jung, Eric C. Leuthardt, Royce A. Levien, Robert W. Lord, Mark A. Malamud, John D. Rinaldo, Jr., and Lowell L. Wood, Jr. as inventors, filed 17 Jun. 2008, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 12/217,118, entitled MOTION RESPONSIVE DEVICES AND SYSTEMS, naming Edward K. Y. Jung, Eric C. Leuthardt, Royce A. Levien, Robert W. Lord, Mark A. Malamud, John D. Rinaldo, Jr., and Lowell L. Wood, Jr. as inventors, filed 30 Jun. 2008, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 12/217,116, entitled SYSTEMS AND METHODS FOR PROJECTING, naming Edward K. Y. Jung, Eric C. Leuthardt, Royce A. Levien, Robert W. Lord, Mark A. Malamud, John D. Rinaldo, Jr., and Lowell L. Wood, Jr. as inventors, filed 30 Jun. 2008, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 12/217,115, entitled SYSTEMS AND METHODS FOR TRANSMITTING INFORMATION ASSOCIATED WITH PROJECTING, naming Edward K. Y. Jung, Eric C. Leuthardt, Royce A. Levien, Robert W. Lord, Mark A. Malamud, John D. Rinaldo, Jr., and Lowell L. Wood, Jr. as inventors, filed 30 Jun. 2008, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 12/217,123, entitled SYSTEMS AND METHODS FOR RECEIVING INFORMATION ASSOCIATED WITH PROJECTING, naming Edward K. Y. Jung, Eric C. Leuthardt, Royce A. Levien, Robert W. Lord, Mark A. Malamud, John D. Rinaldo, Jr., and Lowell L. Wood, Jr. as inventors, filed 30 Jun. 2008, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 12/217,135, entitled SYSTEMS AND METHODS FOR PROJECTING IN RESPONSE TO POSITION, naming Edward K. Y. Jung, Eric C. Leuthardt, Royce A. Levien, Robert W. Lord, Mark A. Malamud, John D. Rinaldo, Jr., and Lowell L. Wood, Jr. as inventors, filed 30 Jun. 2008, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 12/217,117, entitled SYSTEMS AND METHODS FOR PROJECTING IN RESPONSE TO CONFORMATION, naming Edward K. Y. Jung, Eric C. Leuthardt, Royce A. Levien, Robert W. Lord, Mark A. Malamud, John D. Rinaldo, Jr., and Lowell L. Wood, Jr. as inventors, filed 30 Jun. 2008, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 12/218,269, entitled SYSTEMS AND METHODS FOR PROJECTING IN RESPONSE TO POSITION, naming Edward K. Y. Jung, Eric C. Leuthardt, Royce A. Levien, Richard T. Lord, Robert W. Lord, Mark A. Malamud, John D. Rinaldo, Jr., and Lowell L. Wood, Jr. as inventors, filed 11 Jul. 2008, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 12/218,266, entitled SYSTEMS AND METHODS FOR PROJECTING IN RESPONSE TO POSITION, naming Edward K. Y. Jung, Eric C. Leuthardt, Royce A. Levien, Richard T. Lord, Robert W. Lord, Mark A. Malamud, John D. Rinaldo, Jr., and Lowell L. Wood, Jr. as inventors, filed 11 Jul. 2008, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 12/218,267, entitled SYSTEMS AND METHODS FOR PROJECTING IN RESPONSE TO CONFORMATION, naming Edward K. Y. Jung, Eric C. Leuthardt, Royce A. Levien, Richard T. Lord, Robert W. Lord, Mark A. Malamud, John D. Rinaldo, Jr., and Lowell L. Wood, Jr. as inventors, filed 11 Jul. 2008, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 12/218,268, entitled SYSTEMS AND METHODS FOR PROJECTING IN RESPONSE TO CONFORMATION, naming Edward K. Y. Jung, Eric C. Leuthardt, Royce A. Levien, Richard T. Lord, Robert W. Lord, Mark A. Malamud, John D. Rinaldo, Jr., and Lowell L. Wood, Jr. as inventors, filed 11 Jul. 2008, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 12/220,906, entitled METHODS AND SYSTEMS FOR RECEIVING AND TRANSMITTING SIGNALS ASSOCIATED WITH PROJECTION, naming Edward K. Y. Jung, Eric C. Leuthardt, Royce A. Levien, Richard T. Lord, Robert W. Lord, Mark A. Malamud, John D. Rinaldo, Jr., and Lowell L. Wood, Jr. as inventors, filed 28 Jul. 2008, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. UNKNOWN, entitled PROJECTION IN RESPONSE TO CONFORMATION, naming Edward K. Y. Jung, Eric C. Leuthardt, Royce A. Levien, Richard T. Lord, Robert W. Lord, Mark A. Malamud, John D. Rinaldo, Jr., and Lowell L. Wood, Jr. as inventors, filed 22 Aug. 2008, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. UNKNOWN, entitled METHODS AND SYSTEMS FOR PROJECTING IN RESPONSE TO POSITION, naming Edward K. Y. Jung, Eric C. Leuthardt, Royce A. Levien, Richard T. Lord, Robert W. Lord, Mark A. Malamud, John D. Rinaldo, Jr., and Lowell L. Wood, Jr. as inventors, filed 22 Aug. 2008, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. UNKNOWN, entitled METHODS AND SYSTEMS FOR PROJECTING IN RESPONSE TO POSITION, naming Edward K. Y. Jung, Eric C. Leuthardt, Royce A. Levien, Richard T. Lord, Robert W. Lord, Mark A. Malamud, John D. Rinaldo, Jr., and Lowell L. Wood, Jr. as inventors, filed 22 Aug. 2008, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. UNKNOWN, entitled METHODS AND SYSTEMS FOR PROJECTING IN RESPONSE TO CONFORMATION, naming Edward K. Y. Jung, Eric C. Leuthardt, Royce A. Levien, Richard T. Lord, Robert W. Lord, Mark A. Malamud, John D. Rinaldo, Jr., and Lowell L. Wood, Jr. as inventors, filed 22 Aug. 2008, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. UNKNOWN, entitled METHODS AND SYSTEMS FOR PROJECTING IN RESPONSE TO CONFORMATION, naming Edward K. Y. Jung, Eric C. Leuthardt, Royce A. Levien, Richard T. Lord, Robert W. Lord, Mark A. Malamud, John D. Rinaldo, Jr., and Lowell L. Wood, Jr. as inventors, filed 22 Aug. 2008, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
The United States Patent Office (USPTO) has published a notice to the effect that the USPTO's computer programs require that patent applicants both reference a serial number and indicate whether an application is a continuation or continuation-in-part. Stephen G. Kunin, Benefit of Prior-Filed Application, USPTO Official Gazette Mar. 18, 2003, available at http://www.uspto.gov/web/offices/com/sol/og/2003/week11/patbene.htm. The present Applicant Entity (hereinafter “Applicant”) has provided above a specific reference to the application(s) from which priority is being claimed as recited by statute. Applicant understands that the statute is unambiguous in its specific reference language and does not require either a serial number or any characterization, such as “continuation” or “continuation-in-part,” for claiming priority to U.S. patent applications. Notwithstanding the foregoing, Applicant understands that the USPTO's computer programs have certain data entry requirements, and hence Applicant is designating the present application as a continuation-in-part of its parent applications as set forth above, but expressly points out that such designations are not to be construed in any way as any type of commentary and/or admission as to whether or not the present application contains any new matter in addition to the matter of its parent application(s).
All subject matter of the Related Applications and of any and all parent, grandparent, great-grandparent, etc. applications of the Related Applications is incorporated herein by reference to the extent such subject matter is not inconsistent herewith.
TECHNICAL FIELD

The present disclosure relates to systems and methods related to projection.
SUMMARY

In one aspect, a method includes but is not limited to obtaining information related to one or more positions associated with one or more projection surfaces and accessing content in response to the information related to one or more positions associated with one or more projection surfaces. The method may optionally include projecting in response to the accessing content. The method may optionally include coordinating one or more positions associated with the one or more projection surfaces with one or more commands. The method may optionally include projecting in response to the coordinating one or more positions associated with the one or more projection surfaces with one or more commands. In addition to the foregoing, other aspects are described in the claims, drawings, and text forming a part of the present disclosure.
In one aspect, a system includes but is not limited to circuitry for obtaining information related to one or more positions associated with one or more projection surfaces and circuitry for accessing content in response to the circuitry for obtaining information related to one or more positions associated with one or more projection surfaces. The system may optionally include circuitry for projecting in response to the circuitry for accessing content. The system may optionally include circuitry for coordinating one or more positions associated with the one or more projection surfaces with one or more commands. The system may optionally include circuitry for projecting in response to the circuitry for coordinating one or more positions associated with the one or more projection surfaces with one or more commands. In addition to the foregoing, other system aspects are described in the claims, drawings, and text forming a part of the present disclosure.
In one aspect, a system includes but is not limited to means for obtaining information related to one or more positions associated with one or more projection surfaces and means for accessing content in response to the means for obtaining information related to one or more positions associated with one or more projection surfaces. The system may optionally include means for projecting in response to the means for accessing content. The system may optionally include means for coordinating one or more positions associated with the one or more projection surfaces with one or more commands. The system may optionally include means for projecting in response to the means for coordinating one or more positions associated with the one or more projection surfaces with one or more commands. In addition to the foregoing, other system aspects are described in the claims, drawings, and text forming a part of the present disclosure.
In one aspect, a system includes but is not limited to a signal-bearing medium bearing one or more instructions for obtaining information related to one or more positions associated with one or more projection surfaces and one or more instructions for accessing content in response to the information related to one or more positions associated with one or more projection surfaces. The system may optionally include one or more instructions for projecting in response to accessing content. The system may optionally include one or more instructions for coordinating one or more positions associated with the one or more projection surfaces with one or more commands. The system may optionally include one or more instructions for projecting in response to the coordinating one or more positions associated with the one or more projection surfaces with one or more commands. In addition to the foregoing, other system aspects are described in the claims, drawings, and text forming a part of the present disclosure.
In one or more various aspects, means include but are not limited to circuitry and/or programming for effecting the herein-referenced functional aspects; the circuitry and/or programming can be virtually any combination of hardware, software, and/or firmware configured to effect the herein-referenced functional aspects depending upon the design choices of the system designer. In addition to the foregoing, other system aspects are described in the claims, drawings, and/or text forming a part of the present disclosure.
In one or more various aspects, related systems include but are not limited to circuitry and/or programming for effecting the herein-referenced method aspects; the circuitry and/or programming can be virtually any combination of hardware, software, and/or firmware configured to effect the herein-referenced method aspects depending upon the design choices of the system designer. In addition to the foregoing, other system aspects are described in the claims, drawings, and/or text forming a part of the present application.
The foregoing is a summary and thus may contain simplifications, generalizations, inclusions, and/or omissions of detail; consequently, those skilled in the art will appreciate that the summary is illustrative only and is NOT intended to be in any way limiting. Other aspects, features, and advantages of the devices and/or processes and/or other subject matter described herein will become apparent in the teachings set forth herein.
BRIEF DESCRIPTION OF THE FIGURES

FIG. 1 illustrates an example system 100 in which embodiments may be implemented.
FIG. 1A illustrates example components that may be implemented within example system 100.
FIG. 1B illustrates example components that may be implemented within example system 100.
FIG. 1C illustrates example components that may be implemented within example system 100.
FIG. 2 illustrates an operational flow 200 representing example operations related to obtaining information related to one or more positions associated with one or more projection surfaces and accessing content in response to the information related to one or more positions associated with one or more projection surfaces.
FIG. 3 illustrates alternative embodiments of the example operational flow of FIG. 2.
FIG. 4 illustrates alternative embodiments of the example operational flow of FIG. 2.
FIG. 5 illustrates alternative embodiments of the example operational flow of FIG. 2.
FIG. 6 illustrates alternative embodiments of the example operational flow of FIG. 2.
FIG. 7 illustrates alternative embodiments of the example operational flow of FIG. 2.
FIG. 8 illustrates alternative embodiments of the example operational flow of FIG. 2.
FIG. 9 illustrates alternative embodiments of the example operational flow of FIG. 2.
FIG. 10 illustrates an operational flow 1000 representing example operations related to obtaining information related to one or more positions associated with one or more projection surfaces, accessing content in response to the information related to one or more positions associated with one or more projection surfaces, and projecting in response to the accessing content.
FIG. 11 illustrates alternative embodiments of the example operational flow of FIG. 10.
FIG. 12 illustrates an operational flow 1200 representing example operations related to obtaining information related to one or more positions associated with one or more projection surfaces, accessing content in response to the information related to one or more positions associated with one or more projection surfaces, and coordinating one or more positions associated with the one or more projection surfaces with one or more commands.
FIG. 13 illustrates alternative embodiments of the example operational flow of FIG. 12.
FIG. 14 illustrates alternative embodiments of the example operational flow of FIG. 12.
FIG. 15 illustrates an operational flow 1500 representing example operations related to obtaining information related to one or more positions associated with one or more projection surfaces, accessing content in response to the information related to one or more positions associated with one or more projection surfaces, coordinating one or more positions associated with the one or more projection surfaces with one or more commands, and projecting in response to the coordinating one or more positions associated with the one or more projection surfaces with one or more commands.
FIG. 16 illustrates alternative embodiments of the example operational flow of FIG. 15.
FIG. 17 illustrates alternative embodiments of the example operational flow of FIG. 15.
FIG. 18 illustrates a partial view of a system 1800 that includes a computer program for executing a computer process on a computing device.
FIG. 19 illustrates a partial view of a system 1900 that includes a computer program for executing a computer process on a computing device.
FIG. 20 illustrates a partial view of a system 2000 that includes a computer program for executing a computer process on a computing device.
FIG. 21 illustrates a partial view of a system 2100 that includes a computer program for executing a computer process on a computing device.
DETAILED DESCRIPTION

In the following detailed description, reference is made to the accompanying drawings, which form a part hereof. In the drawings, similar symbols typically identify similar components, unless context dictates otherwise. The illustrative embodiments described in the detailed description, drawings, and claims are not meant to be limiting. Other embodiments may be utilized, and other changes may be made, without departing from the spirit or scope of the subject matter presented here.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
FIG. 1 illustrates an example system 100 in which embodiments may be implemented. In some embodiments, system 100 may include one or more devices 105. In some embodiments, system 100 may include one or more housings 110. In some embodiments, system 100 may include system memory 140. In some embodiments, system 100 may include one or more projectors 130. In some embodiments, system 100 may include one or more projector control units 120. In some embodiments, system 100 may include one or more motion response modules 190. In some embodiments, system 100 may include one or more sensor control units 170. In some embodiments, system 100 may include one or more sensors 150. In some embodiments, system 100 may include one or more interface modules 180. In some embodiments, system 100 may include one or more projection surfaces 200. In some embodiments, system 100 may include one or more user interfaces 300. In some embodiments, system 100 may include one or more external devices 400. In some embodiments, system 100 may include external memory 500. In some embodiments, system 100 may provide for user 600 interaction. In some embodiments, system 100 may include two or more projectors 130 that project in a coordinated manner. For example, in some embodiments, two or more projectors 130 may project the same content such that the projections are registered together to create a continuous projection.
Device

A system may include one or more devices 105. A device 105 may be configured to have numerous conformations. In some embodiments, a device 105 may be configured as a hand held device. For example, in some embodiments, a device 105 may be configured as a computer mouse. In some embodiments, a device 105 may be configured as a hand held projector 130. In some embodiments, a device 105 may be configured as a hand held projector 130 and laser pointer. In some embodiments, a device 105 may be configured as a mountable device 105. For example, in some embodiments, a device 105 may be configured as a device 105 that may be mounted to a ceiling. In some embodiments, a device 105 may be configured as a ceiling mounted device 105 that may be configured to project content onto one or more portions of one or more substantially vertical surfaces. In some embodiments, a device 105 may be configured as a ceiling mounted device 105 that may be configured to project content onto one or more portions of one or more substantially horizontal surfaces. In some embodiments, a device 105 may be configured as a ceiling mounted device 105 that may be configured to project content onto one or more portions of one or more substantially vertical surfaces and onto one or more portions of one or more substantially horizontal surfaces. In some embodiments, a device 105 may be configured to project content onto one or more portions of one or more tabletops. For example, in some embodiments, a device 105 may be mounted onto a wall and configured to project content onto one or more tabletops. In some embodiments, a device 105 may be mounted and/or positioned onto a desk and configured to project content onto one or more desktops. In some embodiments, a device 105 may be mounted to or otherwise contained within another system, such as a desktop or mobile computer, PDA, cellular phone, camera 163, video player, or other system, for the display of content associated with that system. Accordingly, a device 105 may be configured in numerous ways to project content onto numerous types of projection surfaces 200.
In some embodiments, a device 105 may be configured to project in response to motion imparted to the device 105. In some embodiments, a device 105 may be configured to project content in a manner that is dependent upon one or more substantially specific motions that are imparted to the device 105. For example, in some embodiments, a device 105 may be configured to project content contained on pages of a book in a manner that is motion dependent. Accordingly, in some embodiments, a device 105 may be configured to project content contained on the next page in a series upon rotation of the device 105 in a clockwise direction. In some embodiments, a device 105 may be configured to project content contained on the preceding page in a series upon rotation of the device 105 in a counterclockwise direction. In some embodiments, a device 105 may be configured to project content on the next page in a series upon being moved to the left from a starting position and then moved substantially back to the starting position. In some embodiments, the device 105 may be configured to project content on the preceding page in a series upon being moved to the right from a starting position and then moved substantially back to the starting position. In some embodiments, a device 105 may select content to be projected in response to motion imparted to the device 105. For example, in some embodiments, a device 105 may be configured to project content associated with a newspaper when the device 105 is positioned in a first orientation and be configured to project content associated with a news magazine when positioned in a second orientation. In some embodiments, a device 105 may be configured to correlate substantially specific motions with projection commands to select content in a motion dependent manner. In some embodiments, a device 105 may be configured to correlate substantially specific motions with projection commands to project content in a motion dependent manner. In some embodiments, a device 105 may be configured to correlate substantially specific motions with projection commands to select and project content in a motion dependent manner.
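Purely as an illustrative, non-limiting sketch of the motion-dependent page selection described above, the following Python fragment models classified motions as events and maps them onto page-selection commands. All names here (Motion, PageNavigator, on_motion) are hypothetical and do not correspond to numbered elements of system 100.

    # Hypothetical sketch: motion-dependent page selection for projection.
    from dataclasses import dataclass

    @dataclass
    class Motion:
        kind: str  # e.g., "rotate_cw", "rotate_ccw", "left_and_back", "right_and_back"

    class PageNavigator:
        def __init__(self, pages):
            self.pages = pages
            self.index = 0

        def on_motion(self, motion: Motion) -> str:
            # Clockwise rotation (or a move-left-and-return gesture) advances
            # one page; counterclockwise (or move-right-and-return) goes back.
            if motion.kind in ("rotate_cw", "left_and_back"):
                self.index = min(self.index + 1, len(self.pages) - 1)
            elif motion.kind in ("rotate_ccw", "right_and_back"):
                self.index = max(self.index - 1, 0)
            return self.pages[self.index]  # content handed to the projector

    nav = PageNavigator(["page 1", "page 2", "page 3"])
    print(nav.on_motion(Motion("rotate_cw")))   # -> page 2
    print(nav.on_motion(Motion("rotate_ccw")))  # -> page 1

In an actual device 105 the motion classification would come from the sensors described below; the sketch only illustrates the correlation of classified motions with projection commands.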
In some embodiments, a device 105 may be configured to project content in a manner that is dependent upon a person who is associated with the device 105. For example, in some embodiments, a device 105 may be configured to project children's content if used by a child. In some embodiments, a device 105 may be configured to project the statistics associated with various sports teams when associated with a first person and configured to project stock quotes when associated with a second person. Accordingly, a device 105 may be configured to project content that is selected in accordance with specific persons or classes of persons.
Housing

System 100 may include one or more devices 105 that include one or more housings 110. In some embodiments, a housing 110 may be configured to include one or more projectors 130, one or more projector control units 120, one or more motion response modules 190, one or more sensor control units 170, one or more sensors 150, one or more interface modules 180, or substantially any combination thereof. In some embodiments, a housing 110 may be configured for use in a handheld device 105. In some embodiments, a housing 110 may be configured for use in a mountable device 105. Accordingly, a housing 110 may be configured to have numerous conformations. A housing 110 may be constructed from numerous types of materials and combinations of materials. Examples of such materials include, but are not limited to, plastics, metals, papers, ceramics, and the like. In some embodiments, a housing 110 may include electrical connections to provide for operable association of components associated with the housing 110. In some embodiments, a housing 110 may include optical connections to provide for operable association of components associated with the housing 110.
Memory

System 100 may include numerous types of system memory 140. Examples of system memory 140 include, but are not limited to, flash memory, random access memory, read-only memory, hard drives, optical storage, external memory 500, and the like. In some embodiments, the system memory 140 may be dedicated for access from one or more individual components (e.g., one or more processors) contained within system 100. In some embodiments, the system memory 140 may be included within one or more devices 105. In some embodiments, the system memory 140 may be included within one or more devices 105 and may be dedicated for access from one or more individual components (e.g., one or more processors) included within the device 105. In some embodiments, the system memory 140 that is included within the device 105 may be configured for system wide access. System memory 140 may be configured in numerous ways. Examples of such configurations include, but are not limited to, projector processor memory 132, projector memory 134, control processor memory 122, control memory 124, response processor memory 192, response memory 194, sensor processor memory 172, sensor memory 176, and substantially any combination thereof.
Projector

System 100 may include one or more projectors 130. In some embodiments, a projector 130 may be operably associated with one or more projector control units 120. In some embodiments, a projector 130 may be operably associated with one or more motion response modules 190. In some embodiments, a projector 130 may be operably associated with one or more interface modules 180. In some embodiments, a projector 130 may be operably associated with one or more sensors 150. In some embodiments, a projector 130 may be operably associated with one or more sensor control units 170. In some embodiments, a projector 130 may be operably associated with system memory 140. In some embodiments, a projector 130 may be operably associated with one or more projector processors 131. In some embodiments, a projector 130 may be operably associated with projector processor memory 132. In some embodiments, a projector 130 may be operably associated with one or more projector instructions 133. In some embodiments, a projector 130 may be operably associated with projector memory 134. In some embodiments, a projector 130 may be operably associated with projector memory instructions 135. In some embodiments, a projector 130 may be operably associated with one or more projector calibration images 136. In some embodiments, a projector 130 may be operably associated with one or more control motion patterns 127. In some embodiments, a projector 130 may be operably associated with one or more user interfaces 300. In some embodiments, a projector 130 may be operably associated with one or more external devices 400. In some embodiments, a projector 130 may be operably associated with external memory 500. In some embodiments, a projector 130 may be operably associated with one or more housings 110. In some embodiments, a projector 130 may be an image stabilized projector 130.
System 100 may include numerous types of image stabilized projectors 130. In some embodiments, a projector 130 may include inertia and yaw rate sensors 161 that detect motion and provide for adjustment of projected content to compensate for the detected motion. In some embodiments, a projector 130 may include an optoelectronic inclination sensor and an optical position displacement sensor to provide for stabilized projection (e.g., U.S. Published Patent Application No. 2003/0038927). In some embodiments, a projector 130 may include an optoelectronic inclination sensor, an optical position sensitive detector, and a piezoelectric accelerometer that provide for stabilized projection (e.g., U.S. Published Patent Application No. 2003/0038928). Image stabilized projectors 130 have been described (e.g., U.S. Pat. No. 7,284,866; U.S. Published Patent Application Nos. 2005/0280628; 2006/0103811; and 2006/0187421). In some embodiments, one or more projectors 130 may be modified to become image stabilized projectors 130. Examples of such projectors 130 have been described (e.g., U.S. Pat. Nos. 6,002,505; 6,764,185; 6,811,264; 7,036,936; 6,626,543; 7,134,078; 7,355,584; U.S. Published Patent Application No. 2007/0109509).
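As a rough, non-authoritative illustration of the compensation idea (not the method of any patent or published application cited above), the following sketch converts sensed angular rates over one frame into an opposing pixel shift. The sensor readings, the small-angle model, and the pixels_per_degree calibration constant are all assumptions.

    # Hypothetical sketch: offsetting the projected image to counteract
    # sensed angular motion over a single frame interval.
    def compensate_offset(yaw_rate_dps: float, pitch_rate_dps: float,
                          dt_s: float, pixels_per_degree: float) -> tuple:
        """Convert angular rates (degrees/second) over one frame into a pixel
        shift that counteracts the apparent image motion."""
        dx = -yaw_rate_dps * dt_s * pixels_per_degree
        dy = -pitch_rate_dps * dt_s * pixels_per_degree
        return (dx, dy)

    # One 60 Hz frame while the device yaws at 3 degrees/second:
    print(compensate_offset(3.0, 0.0, dt_s=1 / 60, pixels_per_degree=20.0))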
Projectors 130 may be configured to project numerous wavelengths of light. In some embodiments, a projector 130 may be configured to project ultraviolet light. In some embodiments, a projector 130 may be configured to project visible light. In some embodiments, a projector 130 may be configured to project infrared light. In some embodiments, a projector 130 may be configured to project numerous combinations of light. For example, in some embodiments, a projector 130 may project one or more infrared calibration images and one or more visible images.
Motion Response Module

In some embodiments, system 100 may include one or more motion response modules 190. In some embodiments, one or more motion response modules 190 may be operably associated with one or more projectors 130. In some embodiments, one or more motion response modules 190 may be operably associated with one or more projector control units 120. In some embodiments, one or more motion response modules 190 may be operably associated with one or more sensors 150. In some embodiments, one or more motion response modules 190 may be operably associated with one or more sensor control units 170. In some embodiments, one or more motion response modules 190 may be operably associated with one or more response processors 191. In some embodiments, one or more motion response modules 190 may be operably associated with response processor memory 192. In some embodiments, one or more motion response modules 190 may be operably associated with one or more response processor instructions 193. In some embodiments, one or more motion response modules 190 may be operably associated with response memory 194. In some embodiments, one or more motion response modules 190 may be operably associated with one or more response instructions 195. In some embodiments, one or more motion response modules 190 may be operably associated with one or more response motion patterns 196. In some embodiments, a motion response module 190 may be configured to modulate output from a projector 130 in response to motion that is imparted to a device 105 that includes the projector 130. For example, in some embodiments, a motion response module 190 may include one or more motors 198 that are operably coupled to one or more actuators 197 that control one or more lenses. Accordingly, in some embodiments, one or more motion response modules 190 may focus output from a projector 130 in response to motion imparted to a device 105 that includes the image stabilized projector 130. Motion response modules 190 may be configured in numerous conformations to modulate output from an operably associated projector 130.
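A minimal sketch of such modulation follows, assuming a hypothetical actuator interface and a crude distance-to-focus mapping; a real module would rely on lens calibration data rather than the placeholder arithmetic used here.

    # Hypothetical sketch: refocusing a lens when motion changes the
    # distance between the projector and the projection surface.
    class LensActuator:
        def move_to(self, focus_setting: float):
            print(f"actuator: focus set to {focus_setting:.1f}")

    class MotionResponseModule:
        def __init__(self, actuator: LensActuator):
            self.actuator = actuator

        def on_range_update(self, throw_distance_mm: float):
            # Placeholder mapping from throw distance to a focus setting;
            # an actual module would use the lens maker's calibration data.
            self.actuator.move_to(throw_distance_mm / 100.0)

    module = MotionResponseModule(LensActuator())
    module.on_range_update(2500.0)  # device moved; surface now ~2.5 m away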
Projector Control Unit

System 100 may include one or more projector control units 120. In some embodiments, one or more projector control units 120 may be operably associated with one or more projectors 130. In some embodiments, one or more projector control units 120 may be operably associated with one or more motion response modules 190. In some embodiments, one or more projector control units 120 may be operably associated with one or more projectors 130 and one or more motion response modules 190. In some embodiments, a projector control unit 120 may be operably associated with one or more control processors 121. In some embodiments, a projector control unit 120 may be operably associated with control processor memory 122. In some embodiments, a projector control unit 120 may be operably associated with one or more control processor instructions 123. In some embodiments, a projector control unit 120 may be operably associated with control memory 124. In some embodiments, a projector control unit 120 may be operably associated with one or more control instructions 125. In some embodiments, a projector control unit 120 may be operably associated with one or more control calibration images 126. In some embodiments, a projector control unit 120 may be operably associated with one or more control motion patterns 127. In some embodiments, a projector control unit 120 may be configured to modulate output projected by one or more projectors 130. In some embodiments, one or more projector control units 120 may be configured to select one or more wavelengths of light that will be projected by one or more projectors 130. For example, in some embodiments, one or more projector control units 120 may select one or more wavelengths of ultraviolet light that will be projected by one or more projectors 130. In some embodiments, one or more projector control units 120 may select one or more wavelengths of visible light that will be projected by one or more projectors 130. In some embodiments, one or more projector control units 120 may select one or more wavelengths of infrared light that will be projected by one or more projectors 130. Accordingly, in some embodiments, one or more projector control units 120 may select numerous wavelengths of light that will be projected by one or more projectors 130.
In some embodiments, one or more projector control units 120 may select content that is to be projected by one or more projectors 130. In some embodiments, one or more projector control units 120 may select content that is to be projected in response to one or more features associated with one or more projection surfaces 200. For example, in some embodiments, one or more projector control units 120 may select content that is to be projected in response to motion. In some embodiments, one or more projector control units 120 may select content that is to be projected in response to motion associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may select content that is not to be projected by one or more projectors 130. In some embodiments, one or more projector control units 120 may select content that is not to be projected in response to one or more features associated with one or more projection surfaces 200. For example, in some embodiments, one or more projector control units 120 may select content that is not to be projected in response to motion. In some embodiments, one or more projector control units 120 may select content that is not to be projected in response to motion associated with one or more projection surfaces 200.
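The selection logic might resemble the following sketch, in which content is chosen, or suppressed, from a reported surface state; the dictionary-based lookup, the field names, and the speed threshold are illustrative assumptions only.

    # Hypothetical sketch: selecting or suppressing content in response to
    # features reported for a projection surface.
    from typing import Optional

    def select_content(surface_state: dict) -> Optional[str]:
        # Suppress projection while the surface is moving quickly; otherwise
        # pick content keyed to the surface's reported location.
        if surface_state.get("speed_mm_s", 0.0) > 100.0:
            return None  # nothing projected during rapid motion
        by_location = {"desk": "document view", "wall": "slide view"}
        return by_location.get(surface_state.get("location"), "default view")

    print(select_content({"location": "wall", "speed_mm_s": 5.0}))    # slide view
    print(select_content({"location": "wall", "speed_mm_s": 250.0}))  # None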
In some embodiments, one or more projector control units 120 may modulate output that is projected by one or more projectors 130. In some embodiments, one or more projector control units 120 may modulate the intensity of light that is projected by one or more projectors 130. In some embodiments, one or more projector control units 120 may modulate the brightness of light that is projected by one or more projectors 130. In some embodiments, one or more projector control units 120 may modulate the contrast of light that is projected by one or more projectors 130. In some embodiments, one or more projector control units 120 may modulate the sharpness of light that is projected by one or more projectors 130.
In some embodiments, one or more projector control units 120 may modulate the direction of output that is projected by one or more projectors 130. In some embodiments, one or more projector control units 120 may direct output from one or more projectors 130 onto one or more moving projection surfaces 200. In some embodiments, one or more projector control units 120 may direct output from one or more projectors 130 onto one or more stationary projection surfaces 200. In some embodiments, one or more projector control units 120 may direct output from one or more projectors 130 onto one or more moving projection surfaces 200 and onto one or more stationary projection surfaces 200. In some embodiments, one or more projector control units 120 may direct output from one or more projectors 130 onto multiple projection surfaces 200. For example, in some embodiments, one or more projector control units 120 may direct output from one or more projectors 130 onto a first projection surface 200 and direct output from one or more projectors 130 onto a second projection surface 200.
In some embodiments, one or more projector control units 120 may dynamically modulate output from one or more projectors 130. For example, in some embodiments, one or more projectors 130 may be carried from room to room such that one or more projector control units 120 modulate output from the one or more projectors 130 in response to the available projection surface 200. In some embodiments, one or more projector control units 120 may dynamically modulate output from one or more projectors 130 in an outdoor environment. For example, in some embodiments, one or more projectors 130 may be configured to project one or more images in response to changing terrain.
In some embodiments, one or more projector control units 120 may be configured to respond to one or more substantially defined motions. In some embodiments, a user 600 may program one or more projector control units 120 to correlate one or more substantially defined motions with one or more projection commands. For example, in some embodiments, a user 600 may program one or more projector control units 120 to correlate clockwise motion of device 105 with a command to advance a projected slide presentation by one slide. Accordingly, in some embodiments, a device 105 may be configured to project in response to substantially defined motions that are programmed according to the preferences of an individual user 600.
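One way to picture such user programming is a small registry that binds named motions to callable projection commands, as in the following hypothetical sketch; the bind/dispatch interface is an assumption, not an element of system 100.

    # Hypothetical sketch: user-programmable correlation of substantially
    # defined motions with projection commands.
    class CommandRegistry:
        def __init__(self):
            self.bindings = {}

        def bind(self, motion_name: str, command):
            # The user programs a motion-to-command correlation.
            self.bindings[motion_name] = command

        def dispatch(self, motion_name: str):
            # Invoked when the sensors classify an incoming motion.
            command = self.bindings.get(motion_name)
            if command:
                command()

    registry = CommandRegistry()
    registry.bind("clockwise", lambda: print("advance slide presentation by one slide"))
    registry.dispatch("clockwise")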
Sensor Control Unit

System 100 may include one or more sensor control units 170. In some embodiments, one or more sensor control units 170 may be operably associated with one or more devices 105. In some embodiments, one or more sensor control units 170 may be operably associated with one or more sensors 150. In some embodiments, one or more sensor control units 170 may be operably associated with one or more projectors 130. In some embodiments, one or more sensor control units 170 may be operably associated with system memory 140. In some embodiments, one or more sensor control units 170 may be operably associated with one or more sensor processors 171. In some embodiments, one or more sensor control units 170 may be operably associated with sensor processor memory 172. In some embodiments, one or more sensor control units 170 may be operably associated with one or more sensor processor instructions 173. In some embodiments, one or more sensor control units 170 may be operably associated with sensor memory 176. In some embodiments, one or more sensor control units 170 may be operably associated with one or more sensor instructions 177. In some embodiments, one or more sensor control units 170 may be operably associated with one or more sensor motion patterns 174.
In some embodiments, one or more sensor control units 170 may signal a change in sensor response to one or more associated systems. For example, in some embodiments, a change in ambient light signal from one or more ambient light sensors 164, range sensors 165, motion sensors 151, or other sensors 150, alone or in combination, can be stored in memory for future use and/or be signaled to one or more image stabilized projectors 130, where the change in ambient light may trigger a change in projector 130 output intensity. In some embodiments, one or more sensor control units 170 may use prior sensor response, user input, or other stimulus to activate or deactivate one or more sensors 150 or other subordinate features contained within one or more sensor control units 170.
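For illustration only, the following sketch shows a sensor control unit reacting to an ambient light reading by storing it and retuning projector output intensity; the lux thresholds and the dictionary standing in for a projector are arbitrary assumptions.

    # Hypothetical sketch: ambient light changes trigger a change in
    # projector output intensity, with readings kept for future use.
    class SensorControlUnit:
        def __init__(self, projector):
            self.projector = projector
            self.history = []  # readings stored "in memory for future use"

        def on_ambient_light(self, lux: float):
            self.history.append(lux)
            # Brighter surroundings call for higher projector output.
            if lux > 500:
                self.projector["intensity"] = "high"
            elif lux > 100:
                self.projector["intensity"] = "medium"
            else:
                self.projector["intensity"] = "low"

    projector = {"intensity": "medium"}
    unit = SensorControlUnit(projector)
    unit.on_ambient_light(750.0)
    print(projector)  # {'intensity': 'high'}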
Sensor

System 100 may include one or more sensors 150. In some embodiments, one or more sensors 150 may be operably associated with one or more devices 105. In some embodiments, one or more sensors 150 may be operably associated with one or more sensor control units 170. In some embodiments, one or more sensors 150 may be operably associated with system memory 140. In some embodiments, one or more sensors 150 may be operably associated with one or more user interfaces 300. In some embodiments, one or more sensors 150 may be operably associated with one or more projectors 130. In some embodiments, one or more sensors 150 may be operably associated with one or more projector control units 120. In some embodiments, one or more sensors 150 may be operably associated with one or more motion response modules 190. In some embodiments, one or more sensors 150 may be operably associated with one or more housings 110.
A device 105 may include many types of sensors 150 alone or in combination. Examples of sensors 150 include, but are not limited to, cameras 163, light sensors 164, range sensors 165, contact sensors 166, entity sensors 159, infrared sensors 160, yaw rate sensors 161, ultraviolet sensors 162, inertial sensors 155, ultrasonic sensors 156, imaging sensors 157, pressure sensors 158, motion sensors 151, gyroscopic sensors 152, acoustic sensors 153, biometric sensors 154, and the like.
In some embodiments, one or more sensors 150 may be configured to detect motion. In some embodiments, one or more sensors 150 may be configured to detect motion that is imparted to one or more projection surfaces 200. In some embodiments, one or more sensors 150 may be configured to detect motion that is imparted to one or more devices 105 that include the one or more sensors 150. Accordingly, in some embodiments, one or more sensors 150 that are configured to detect motion may be operably associated with one or more projectors 130 to facilitate modulation of projection output in response to motion. In some embodiments, one or more sensors 150 may be associated with one or more projectors 130 through one or more projector control units 120. In some embodiments, one or more sensors 150 may be associated with one or more projectors 130 through one or more motion response modules 190. In some embodiments, one or more sensors 150 may be associated with one or more projectors 130 through or independent of one or more sensor control units 170.
Interface Module

System 100 may include one or more interface modules 180. In some embodiments, one or more interface modules 180 may be operably associated with one or more devices 105. In some embodiments, one or more interface modules 180 may be operably associated with one or more projectors 130. In some embodiments, one or more interface modules 180 may be operably associated with one or more projector control units 120. In some embodiments, one or more interface modules 180 may be operably associated with one or more motion response modules 190. In some embodiments, one or more interface modules 180 may be operably associated with one or more sensors 150. In some embodiments, one or more interface modules 180 may be operably associated with one or more sensor control units 170. In some embodiments, one or more interface modules 180 may be operably associated with one or more external devices 400. In some embodiments, one or more interface modules 180 may be operably associated with external memory 500. In some embodiments, one or more interface modules 180 may be operably associated with one or more user interfaces 300.
An interface module 180 may communicate with other components of system 100 through use of numerous communication formats and combinations of communication formats. Examples of such formats include, but are not limited to, VGA 181, USB 185, wireless USB 189, RS-232 182, infrared 186, Bluetooth 18A, 802.11b/g/n 183, S-video 187, Ethernet 184, DVI-D 188, and the like. In some embodiments, an interface module 180 may include one or more transmitters 18B. In some embodiments, an interface module 180 may include one or more receivers 18C.
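By way of a hypothetical sketch, format selection between an interface module and a peer could be as simple as intersecting capability lists in preference order; the negotiate function below is an assumption, not a protocol defined by this disclosure.

    # Hypothetical sketch: choosing a common communication format between an
    # interface module and a peer device.
    SUPPORTED = ["USB", "wireless USB", "RS-232", "Bluetooth", "802.11b/g/n", "Ethernet"]

    def negotiate(local: list, remote: list) -> str:
        # Pick the first format both ends support, in local preference order.
        for fmt in local:
            if fmt in remote:
                return fmt
        raise ValueError("no common communication format")

    print(negotiate(SUPPORTED, ["Ethernet", "Bluetooth"]))  # -> Bluetooth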
External Device

System 100 may be able to interact with one or more external devices 400. Examples of such external devices 400 include, but are not limited to, projectors 130, recording devices, projection surfaces 200, image acquiring surfaces, image printing surfaces (e.g., a projection surface 200 that facilitates the printing or other recordation of content projected on the surface), networks, the internet, wireless devices (e.g., personal digital assistants, cellular telephones, telephones, television transmissions, etc.), receivers, transmitters, and the like.
External Memory

System 100 may be operably associated with external memory 500. Examples of such external memory 500 include, but are not limited to, USB flash drives, memory cards, external hard drives, networked storage, and the like. In some embodiments, display content may be retrieved from external memory 500. In some embodiments, sensor data, operational parameters, usage information, or other device or subsystem information can be stored on external memory 500.
Projection Surface

System 100 may include one or more projection surfaces 200. In some embodiments, nearly any surface may be utilized as a projection surface 200. In some embodiments, a projection surface 200 may be portable. In some embodiments, a projection surface 200 may be carried by an individual person. For example, in some embodiments, a projection surface 200 may be configured as a sheet of material, a tablet, two or more sheets of material that may be separated from each other, and the like. Accordingly, in some embodiments, a projection surface 200 may be configured as a sheet of material that a user 600 may unfold and place on a surface, such as a desk, wall, floor, ceiling, etc.
In some embodiments, a projection surface 200 may include one or more surface sensors 202 that are associated with the projection surface 200. In some embodiments, a projection surface 200 may include one or more magnetic surface sensors 202. For example, in some embodiments, a projection surface 200 may include magnetic surface sensors 202 that are configured to detect magnetic ink that is applied to the projection surface 200. In some embodiments, a projection surface 200 may include one or more pressure surface sensors 202. For example, in some embodiments, a projection surface 200 may include pressure surface sensors 202 that are configured to detect pressure that is applied to the projection surface 200 (e.g., contact of a stylus with the projection surface 200, contact of a pen with the projection surface 200, contact of a pencil with the projection surface 200, etc.). In some embodiments, a projection surface 200 may include one or more motion surface sensors 202. For example, in some embodiments, a projection surface 200 may include motion surface sensors 202 that are configured to detect movement associated with the projection surface 200. In some embodiments, a projection surface 200 may include one or more strain surface sensors 202. For example, in some embodiments, a projection surface 200 may include strain surface sensors 202 that are configured to detect changes in conformation associated with the projection surface 200. In some embodiments, a projection surface 200 may include one or more positional surface sensors 202 (e.g., global positioning surface sensors 202). For example, in some embodiments, a projection surface 200 may include positional surface sensors 202 that are configured to detect changes in position associated with the projection surface 200.
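The several sensor types might feed one event stream, as in the following illustrative sketch; the SurfaceEvent fields and the thresholds are hypothetical, not elements of the disclosure.

    # Hypothetical sketch: normalizing readings from the various kinds of
    # surface sensors 202 into one event type.
    from dataclasses import dataclass

    @dataclass
    class SurfaceEvent:
        sensor_kind: str   # "magnetic", "pressure", "motion", "strain", "positional"
        value: float
        note: str = ""

    def describe(event: SurfaceEvent) -> str:
        if event.sensor_kind == "pressure" and event.value > 0.5:
            return "stylus, pen, or pencil contact detected"
        if event.sensor_kind == "strain" and event.value > 0.1:
            return "surface conformation changed"
        return "no actionable change"

    print(describe(SurfaceEvent("pressure", 0.8)))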
A projection surface 200 may be constructed from numerous types of materials and combinations of materials. Examples of such materials include, but are not limited to, cloth, plastic, metal, ceramics, paper, wood, leather, glass, and the like. In some embodiments, one or more projection surfaces 200 may exhibit electrochromic properties. In some embodiments, one or more projection surfaces 200 may be coated. For example, in some embodiments, a projection surface 200 may be coated with paint. In some embodiments, a projection surface 200 may include one or more materials that alter light. For example, in some embodiments, a projection surface 200 may convert light (e.g., up-convert light, down-convert light).
In some embodiments, a projection surface 200 may be associated with one or more fiducials. For example, in some embodiments, one or more fluorescent marks may be placed on a projection surface 200. In some embodiments, one or more phosphorescent marks may be placed on a projection surface 200. In some embodiments, one or more magnetic materials may be placed on a projection surface 200. In some embodiments, fiducials may be placed on a projection surface 200 in numerous configurations. For example, in some embodiments, fiducials may be positioned in association with a projection surface 200 such that they form a pattern. In some embodiments, a projection surface 200 may include one or more calibration images.
In some embodiments, a projection surface 200 may include one or more surface transmitters 204. Accordingly, in some embodiments, a projection surface 200 may be configured to transmit one or more signals. Such signals may include numerous types of information. Examples of such information may include, but are not limited to, information associated with: one or more positions of one or more projection surfaces 200, one or more conformations of one or more projection surfaces 200, one or more changes in the position of one or more projection surfaces 200, one or more changes in the conformation of one or more projection surfaces 200, one or more motions associated with one or more projection surfaces 200, one or more changes in the motion of one or more projection surfaces 200, and the like.
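A signal of this kind could, for example, be serialized as a small structured message; the JSON schema below is purely an assumed illustration of the information types listed above, not a format defined by this disclosure.

    # Hypothetical sketch: a surface transmitter 204 payload carrying
    # position, conformation, and motion information.
    import json

    signal = {
        "surface_id": "surface-1",
        "position": {"x_mm": 120.0, "y_mm": 340.0, "z_mm": 0.0},
        "conformation": "unfolded_flat",
        "motion": {"speed_mm_s": 0.0},
        "changed": ["position"],  # which fields changed since the last signal
    }
    payload = json.dumps(signal)
    print(payload)  # what the transmitter would send; a receiver would json.loads it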
In some embodiments, a projection surface 200 may include one or more surface receivers 206. Accordingly, in some embodiments, a projection surface 200 may be configured to receive one or more signals. For example, in some embodiments, one or more surface receivers 206 may receive one or more signals that are transmitted by one or more control unit transmitters 129.
In some embodiments, a projection surface 200 may include one or more surface processors 208. Accordingly, in some embodiments, a surface processor 208 may be configured to process information received from one or more surface sensors 202.
In some embodiments, a projection surface 200 may include surface memory 210. In some embodiments, a surface memory 210 may include one or more lookup tables that include correlation information associated with the position of one or more fiducials associated with a projection surface 200 and one or more conformations of the projection surface 200. In some embodiments, surface memory 210 may include surface instructions 212. In some embodiments, surface instructions 212 may include instructions for a projection surface 200 to transmit one or more signals that indicate that a projection surface 200 has undergone a change in conformation. In some embodiments, surface instructions 212 may include instructions for a projection surface 200 to transmit one or more signals that indicate that a projection surface 200 has undergone a change in position. In some embodiments, surface instructions 212 may include instructions for a projection surface 200 to transmit one or more signals that indicate that a projection surface 200 has undergone a change in motion.
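Such a lookup table might, for illustration, key rounded fiducial-spacing measurements to named conformations; both the spacing ratios and the conformation names below are hypothetical.

    # Hypothetical sketch: a lookup table in surface memory 210 correlating
    # fiducial positions with conformations of the surface.
    CONFORMATION_TABLE = {
        # (relative fiducial spacing ratios) -> conformation
        (1.0, 1.0, 1.0): "flat",
        (1.0, 0.7, 1.0): "folded_once",
        (0.7, 0.7, 0.7): "rolled",
    }

    def lookup_conformation(spacings) -> str:
        # Round measured spacings so nearby readings hit the same table entry.
        key = tuple(round(s, 1) for s in spacings)
        return CONFORMATION_TABLE.get(key, "unknown")

    print(lookup_conformation([1.02, 0.68, 0.97]))  # -> folded_once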
User Interface

System 100 may include one or more user interfaces 300. In some embodiments, one or more user interfaces 300 may be configured as gestural user interfaces 300. In some embodiments, content may be projected in response to substantially specific motion that is imparted to a projection surface 200. For example, in some embodiments, a user 600 may rotate a projection surface 200 in a clockwise direction to advance the projection of a slide presentation by one frame. In some embodiments, a user 600 may advance the projection of a slide presentation by moving one or more projection surfaces 200. In some embodiments, system 100 may respond to user 600 input acquired through sensor information other than motion. For example, in some embodiments, acoustic sensors 153 may be employed for response to voice commands or other auditory signals. In some embodiments, cameras 163 or other imaging detectors may use user 600 location, user 600 gestures, laser pointer location, and/or other information as an input signal. In some embodiments, system 100 may include one or more user interfaces 300 that are configured as control features. Examples of such control features include, but are not limited to, buttons, switches, track balls, and the like. In some embodiments, a user interface 300 may include a projected interface. For example, in some embodiments, a user interface 300 may include a projected keyboard.
User
System 100 may be operated by one or more users 600. In some embodiments, a user 600 may be human. In some embodiments, a user 600 may be a non-human user 600. For example, in some embodiments, a user 600 may be a computer, a robot, and the like. In some embodiments, a user 600 may be proximate to system 100. In some embodiments, a user 600 may be remote from system 100.
Following are a series of flowcharts depicting implementations. For ease of understanding, the flowcharts are organized such that the initial flowcharts present implementations via an example implementation and thereafter the following flowcharts present alternate implementations and/or expansions of the initial flowchart(s) as either sub-component operations or additional component operations building on one or more earlier-presented flowcharts. Those having skill in the art will appreciate that the style of presentation utilized herein (e.g., beginning with a presentation of a flowchart(s) presenting an example implementation and thereafter providing additions to and/or further details in subsequent flowcharts) generally allows for a rapid and easy understanding of the various process implementations. In addition, those skilled in the art will further appreciate that the style of presentation used herein also lends itself well to modular and/or object-oriented program design paradigms.
In FIG. 2 and in following figures that include various examples of operations used during performance of the method, discussion and explanation may be provided with respect to any one or combination of the above-described examples of FIG. 1, and/or with respect to other examples and contexts. However, it should be understood that the operations may be executed in a number of other environments and contexts, and/or modified versions of FIG. 1. Also, although the various operations are presented in the sequence(s) illustrated, it should be understood that the various operations may be performed in other orders than those which are illustrated, or may be performed concurrently.
After a start operation, the operational flow 200 includes an obtaining operation 210 involving obtaining information related to one or more positions associated with one or more projection surfaces. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions associated with one or more projection surfaces 200 directly. For example, in some embodiments, one or more projector control units 120 may obtain information from one or more sensors 150. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions of one or more projection surfaces 200 indirectly. For example, in some embodiments, one or more projector control units 120 may obtain information from one or more external devices 400. In some embodiments, one or more projector control units 120 may receive one or more signals that include information associated with one or more positions of one or more projection surfaces 200 from one or more external devices 400. One or more projector control units 120 may obtain numerous types of information associated with one or more positions of one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may obtain information associated with the position of one or more fiducials that are associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions of one or more marks associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions of one or more calibration images that are associated with one or more projection surfaces 200.
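The direct and indirect acquisition paths described above can be pictured as a polling routine that prefers an attached sensor 150 and falls back to a signal from an external device 400. A minimal sketch, assuming hypothetical record layouts and function names:

```python
# Hypothetical sketch: obtain surface-position information directly from a
# sensor when available, otherwise indirectly from an external device signal.

def read_sensor(sensor):
    """Stand-in for querying a sensor 150; returns None when unavailable."""
    return sensor.get("position") if sensor else None

def receive_external_signal(device):
    """Stand-in for a signal received from an external device 400."""
    return device.get("reported_position") if device else None

def obtain_position(sensor=None, external_device=None):
    position = read_sensor(sensor)                            # direct path
    if position is None:
        position = receive_external_signal(external_device)   # indirect path
    return position

print(obtain_position(sensor={"position": (0.2, 0.7)}))                     # direct
print(obtain_position(external_device={"reported_position": (0.4, 0.1)}))  # indirect
```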
After a start operation, the operational flow 200 includes an accessing operation 220 involving accessing content in response to the information related to one or more positions associated with one or more projection surfaces. In some embodiments, one or more projector control units 120 may access content in response to the information related to one or more positions associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may access content that is included within control memory 124. In some embodiments, one or more projector control units 120 may access content through use of one or more external devices 400. In some embodiments, one or more projector control units 120 may access content that is contained within external memory 500. In some embodiments, one or more projector control units 120 may access content through receipt of one or more signals that include content. Numerous types of content may be accessed. Examples of such content include, but are not limited to, images, text, web-based content, broadcast content, and the like. In some embodiments, one or more projector control units 120 may access content through use of a lookup table. For example, in some embodiments, one or more projector control units 120 may access content through comparing one or more positions of one or more projection surfaces 200 to one or more positions that are indexed to content within a lookup table.
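The position-indexed lookup table mentioned above might be sketched as follows, selecting the content whose stored position is nearest the reported position of a projection surface 200. The stored positions, content labels, and nearest-match rule are illustrative assumptions.

```python
# Hypothetical lookup table indexing content to surface positions.
import math

CONTENT_BY_POSITION = {
    (0.0, 1.5): "slide_presentation",  # e.g., hung on a wall
    (0.0, 0.8): "electronic_mail",     # e.g., lying on a tabletop
}

def access_content(position):
    """Return the content indexed to the stored position nearest `position`."""
    nearest = min(CONTENT_BY_POSITION, key=lambda p: math.dist(p, position))
    return CONTENT_BY_POSITION[nearest]

print(access_content((0.1, 1.4)))  # -> slide_presentation
```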
FIG. 3 illustrates alternative embodiments of the example operational flow 200 of FIG. 2. FIG. 3 illustrates example embodiments where the obtaining operation 210 may include at least one additional operation. Additional operations may include an operation 302, operation 304, operation 306, operation 308, and/or operation 310.
At operation 302, the obtaining operation 210 may include detecting one or more positions associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may be operably associated with one or more sensors 150 that detect one or more positions associated with one or more projection surfaces 200. Numerous types of sensors 150 may be used to detect one or more positions of one or more projection surfaces 200. For example, in some embodiments, one or more light sensors 164 may be configured to detect light intensity associated with one or more projection surfaces 200. In some embodiments, one or more light sensors 164 may be configured to detect reflectivity associated with one or more projection surfaces 200. In some embodiments, one or more light sensors 164 may be configured to detect light absorbance associated with one or more projection surfaces 200. In some embodiments, one or more light sensors 164 may be configured to detect light transmission associated with one or more projection surfaces 200. In some embodiments, one or more motion sensors 151 may be configured to detect motion associated with one or more projection surfaces 200. For example, in some embodiments, one or more motion sensors 151 may detect movement of one portion of a projection surface 200 relative to another portion of the projection surface 200 to indicate a change in the position of the projection surface 200. In some embodiments, one or more cameras 163 may be configured to detect one or more positions of one or more projection surfaces 200. For example, in some embodiments, one or more cameras 163 may be configured to detect the position of one or more fiducials associated with one or more projection surfaces 200. Accordingly, in some embodiments, one or more cameras 163 may be configured to detect the position of one or more projection surfaces 200 through determining one or more positions of one or more fiducials that are associated with the one or more projection surfaces 200. In some embodiments, one or more cameras 163 may be configured to detect one or more positions of one or more projection surfaces 200 through determining the position of one or more calibration images associated with the one or more projection surfaces 200. In some embodiments, one or more calibration images may be associated with one or more projection surfaces 200. For example, in some embodiments, one or more calibration images may be stamped onto one or more projection surfaces 200. In some embodiments, one or more calibration images may be printed onto one or more projection surfaces 200. In some embodiments, one or more calibration images may be projected onto one or more projection surfaces 200. Accordingly, sensors 150 may be configured in numerous ways to facilitate detection of one or more positions of one or more projection surfaces 200.
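As a toy version of fiducial-based detection, the sketch below infers the planar position and rotation of a projection surface 200 from two detected fiducials assumed to sit at the ends of the surface's top edge. The coordinate convention and function name are assumptions made for illustration.

```python
import math

# Hypothetical: infer surface position/orientation from two fiducials that
# are assumed to sit at opposite ends of the surface's top edge.

def surface_pose_from_fiducials(left, right):
    """Return (center, rotation_degrees) of the top edge defined by the
    detected fiducial coordinates `left` and `right`."""
    cx = (left[0] + right[0]) / 2.0
    cy = (left[1] + right[1]) / 2.0
    angle = math.degrees(math.atan2(right[1] - left[1], right[0] - left[0]))
    return (cx, cy), angle

center, rotation = surface_pose_from_fiducials((0.0, 0.0), (0.8, 0.2))
print(center, round(rotation, 1))  # -> (0.4, 0.1) 14.0
```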
At operation 304, the obtaining operation 210 may include obtaining information related to one or more positions associated with the one or more projection surfaces with one or more cameras. In some embodiments, one or more projector control units 120 may be configured to obtain information related to one or more positions associated with one or more projection surfaces 200 with one or more cameras 163. In some embodiments, one or more cameras 163 may be configured to detect one or more positions of one or more projection surfaces 200. For example, in some embodiments, one or more cameras 163 may be configured to detect one or more positions of one or more patterns formed by one or more fiducials that are associated with one or more projection surfaces 200. In some embodiments, one or more cameras 163 may be configured to detect one or more positions of one or more projection surfaces 200 through determining one or more positions of one or more fiducials associated with the one or more projection surfaces 200. In some embodiments, one or more cameras 163 may be configured to detect one or more positions of one or more projection surfaces 200 through determining the position of one or more calibration images associated with the one or more projection surfaces 200. In some embodiments, one or more calibration images may be associated with one or more projection surfaces 200. For example, in some embodiments, one or more calibration images may be stamped onto one or more projection surfaces 200. In some embodiments, one or more calibration images may be printed onto one or more projection surfaces 200. In some embodiments, one or more calibration images may be projected onto one or more projection surfaces 200. In some embodiments, one or more cameras 163 may be configured to facilitate projection onto irregular surfaces (e.g., U.S. Pat. No. 6,811,264).
At operation 306, the obtaining operation 210 may include obtaining information related to one or more positions of one or more fiducials associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to obtain information related to one or more positions of one or more fiducials associated with the one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may receive one or more signals that include information associated with one or more fiducials that are associated with one or more projection surfaces 200. In some embodiments, such signals may be transmitted by one or more surface transmitters 204 that are associated with one or more projection surfaces 200. Numerous types of fiducials may be used alone or in combination while associated with one or more projection surfaces 200. Examples of such fiducials include, but are not limited to, magnetic materials, fluorescent materials, quantum dots, radio-frequency tags, and the like. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions of one or more fiducials from one or more sensors 150. For example, in some embodiments, one or more cameras 163 may be configured to detect one or more positions of one or more fiducials that are associated with one or more projection surfaces 200.
At operation 308, the obtaining operation 210 may include obtaining information related to one or more calibration images associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to obtain information related to one or more calibration images associated with the one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may be configured to obtain information associated with one or more positions of one or more calibration images that are associated with the one or more projection surfaces 200. For example, in some embodiments, one or more sensors 150 may detect one or more calibration images that are associated with one or more projection surfaces 200.
At operation 310, the obtaining operation 210 may include obtaining information related to one or more reflection patterns associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to obtain information related to one or more reflection patterns associated with the one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may be configured to determine one or more positions of one or more projection surfaces 200 through use of one or more reflection patterns that are associated with the one or more projection surfaces 200. For example, in some embodiments, one or more projection surfaces 200 may be associated with one or more reflective fiducials that will produce known reflection patterns that correspond to known positions of the one or more projection surfaces 200.
FIG. 4 illustrates alternative embodiments of the example operational flow 200 of FIG. 2. FIG. 4 illustrates example embodiments where the obtaining operation 210 may include at least one additional operation. Additional operations may include an operation 402, operation 404, operation 406, operation 408, and/or operation 410.
At operation 402, the obtaining operation 210 may include obtaining information related to one or more vertical positions associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to obtain information related to one or more vertical positions associated with the one or more projection surfaces 200. In some embodiments, the conformation of a projection surface 200 may be changed by folding the projection surface 200. For example, in some embodiments, a projection surface 200 that is a sheet may be folded into a cube. Accordingly, in some embodiments, the vertical position of the projection surface 200 will change in accordance with the size of the cube.
At operation 404, the obtaining operation 210 may include obtaining information related to one or more horizontal positions associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to obtain information related to one or more horizontal positions associated with the one or more projection surfaces 200. In some embodiments, the conformation of a projection surface 200 may be changed by folding the projection surface 200. For example, in some embodiments, a projection surface 200 that is a sheet may be folded in half. Accordingly, in some embodiments, the horizontal position of the projection surface 200 will change in accordance with how the projection surface 200 is folded.
At operation 406, the obtaining operation 210 may include obtaining information associated with one or more rotational positions associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to obtain information associated with one or more rotational positions associated with the one or more projection surfaces 200. For example, in some embodiments, a projection surface 200 may be twisted to alter the rotational position of the projection surface 200.
At operation 408, the obtaining operation 210 may include receiving one or more signals that include the information related to one or more positions associated with one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to receive one or more signals that include information related to one or more positions associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may receive one or more signals that include information associated with one or more positions associated with one or more projection surfaces 200 that are transmitted by one or more surface transmitters 204. In some embodiments, one or more projector control units 120 may receive one or more signals that include information associated with one or more positions associated with one or more projection surfaces 200 that are transmitted by one or more external devices 400. For example, in some embodiments, one or more external devices 400 may be configured to detect one or more positions of one or more projection surfaces 200 and transmit one or more signals that include information associated with the one or more positions.
At operation 410, the obtaining operation 210 may include obtaining information related to one or more projection attributes associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to obtain information related to one or more projection attributes associated with the one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may be configured to access memory to determine one or more projection attributes associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may be operably associated with one or more sensors 150 that are configured to determine one or more projection attributes associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may be configured to receive one or more signals that include information related to one or more projection attributes associated with one or more projection surfaces 200. Examples of such projection attributes related to one or more projection surfaces 200 include, but are not limited to, reflectivity, light absorbance, light reflection, light transmission, light emission, ability to record projected content, ability to transmit information associated with projected content, and the like. Accordingly, in some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project in response to one or more attributes associated with one or more projection surfaces 200. For example, in some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project content that is to be printed if a projection surface 200 is able to facilitate printing of content that is projected onto the projection surface 200. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 not to project content that is confidential if a projection surface 200 is able to facilitate printing of content that is projected onto the projection surface 200. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project one or more wavelengths of light in response to one or more attributes associated with a projection surface 200. For example, in some embodiments, a projection surface 200 may be made of material that transmits one or more wavelengths of light preferentially over other wavelengths of light. Accordingly, in some embodiments, a projector control unit 120 may instruct a projector 130 to emit the one or more wavelengths of light that are preferentially transmitted by a projection surface 200. Accordingly, in some embodiments, one or more projector control units 120 may control one or more projectors 130 in accordance with projection attributes associated with one or more projection surfaces 200.
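The wavelength-matching behavior described above can be pictured as filtering a projector 130's available wavelengths against those a projection surface 200 preferentially transmits. A minimal sketch, with invented attribute records and wavelength values:

```python
# Hypothetical: choose projector output wavelengths according to the
# projection attributes reported for a surface.

SURFACE_ATTRIBUTES = {
    "surface_a": {"preferred_wavelengths_nm": {532, 635}, "can_print": True},
    "surface_b": {"preferred_wavelengths_nm": {450, 532, 635}, "can_print": False},
}

PROJECTOR_WAVELENGTHS_NM = {450, 532, 635}

def select_wavelengths(surface_id):
    """Keep only wavelengths the surface preferentially transmits."""
    attrs = SURFACE_ATTRIBUTES[surface_id]
    return sorted(PROJECTOR_WAVELENGTHS_NM & attrs["preferred_wavelengths_nm"])

print(select_wavelengths("surface_a"))  # -> [532, 635]
```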
FIG. 5 illustrates alternative embodiments of the example operational flow 200 of FIG. 2. FIG. 5 illustrates example embodiments where the obtaining operation 210 may include at least one additional operation. Additional operations may include an operation 502 and/or operation 504.
At operation 502, the obtaining operation 210 may include obtaining information related to one or more capture capabilities associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to obtain information related to one or more capture capabilities associated with the one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may be configured to access memory to determine one or more capture capabilities associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may be operably associated with one or more sensors 150 that are configured to determine one or more capture capabilities associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may be configured to receive one or more signals that include information associated with one or more capture capabilities associated with one or more projection surfaces 200. Examples of capture capabilities include, but are not limited to, printing of projected content, transmission of one or more signals that include information associated with projected content, and the like. In some embodiments, one or more projector control units 120 may control one or more projectors 130 in response to one or more capture capabilities associated with one or more projection surfaces 200. For example, in some embodiments, a projector control unit 120 may instruct one or more projectors 130 to project content that is to be printed onto one or more projection surfaces 200 that are capable of facilitating printing of the projected content. In some embodiments, a projector control unit 120 may instruct one or more projectors 130 not to project content that is confidential onto one or more projection surfaces 200 that are capable of facilitating printing of the projected content.
At operation 504, the obtaining operation 210 may include obtaining information related to one or more recording attributes associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to obtain information related to one or more recording attributes associated with the one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may be configured to access memory to determine one or more recording attributes associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may be operably associated with one or more sensors 150 that are configured to determine one or more recording attributes associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may be configured to receive one or more signals that include information associated with one or more recording attributes associated with one or more projection surfaces 200. Examples of recording attributes include, but are not limited to, permanent recordation of projected content, storage of projected content into memory, and the like. In some embodiments, one or more projector control units 120 may control one or more projectors 130 in response to one or more recording attributes associated with one or more projection surfaces 200. For example, in some embodiments, a projector control unit 120 may instruct one or more projectors 130 to project content that is to be saved into memory onto one or more projection surfaces 200 that are capable of recording projected content into memory. In some embodiments, a projector control unit 120 may instruct one or more projectors 130 not to project content that is confidential onto one or more projection surfaces 200 that are capable of saving the projected content into memory.
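Operations 502 and 504 together amount to a policy check before projection. A minimal sketch, assuming a boolean capability record for the surface and a confidential flag on the content (both invented here for illustration):

```python
# Hypothetical policy: withhold confidential content from surfaces that can
# capture or record what is projected onto them.

def may_project(content, surface_capabilities):
    can_capture = (surface_capabilities.get("can_print", False)
                   or surface_capabilities.get("can_record", False))
    return not (content.get("confidential", False) and can_capture)

memo = {"title": "quarterly_numbers", "confidential": True}
poster = {"title": "event_flyer", "confidential": False}
recording_surface = {"can_print": True, "can_record": True}

print(may_project(memo, recording_surface))    # -> False
print(may_project(poster, recording_surface))  # -> True
```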
FIG. 6 illustrates alternative embodiments of the example operational flow 200 of FIG. 2. FIG. 6 illustrates example embodiments where the accessing operation 220 may include at least one additional operation. Additional operations may include an operation 602, operation 604, operation 606, operation 608, and/or operation 610.
At operation 602, the accessing operation 220 may include selecting content in response to the information related to one or more positions associated with one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to select content in response to the information related to one or more positions associated with one or more projection surfaces 200. For example, in some embodiments, one or more projector control units 120 may select confidential information in response to a projection surface 200 being positioned proximate to a specified individual. In some embodiments, one or more projector control units 120 may receive information related to one or more positions that are associated with one or more projection surfaces 200 from the one or more projection surfaces 200 (e.g., one or more signals transmitted from one or more projection surfaces 200). In some embodiments, one or more projector control units 120 may receive information related to one or more positions that are associated with one or more projection surfaces 200 from the one or more sensors 150. In some embodiments, one or more projector control units 120 may select non-confidential information in response to a projection surface 200 being positioned proximate to a group of individuals. In some embodiments, one or more projector control units 120 may access one or more lookup tables that correlate content with one or more positions of one or more projection surfaces 200 in order to select content. In some embodiments, one or more projector control units 120 may access one or more databases that correlate content with one or more positions of one or more projection surfaces 200 in order to select content. Accordingly, in some embodiments, one or more projector control units 120 may select numerous types of content in response to one or more positions of one or more projection surfaces 200.
At operation 604, the accessing operation 220 may include accessing content in response to detecting the one or more positions associated with one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to access content in response to detecting the one or more positions associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may be operably associated with one or more sensors 150 that are configured to detect one or more positions of one or more projection surfaces 200. For example, in some embodiments, one or more projector control units 120 may be operably associated with one or more cameras 163 that are configured to detect the position of a projection surface 200. Accordingly, in some embodiments, one or more projector control units 120 may be operably associated with numerous types of detectors that are configured to detect the position of a projection surface 200.
At operation 606, the accessing operation 220 may include accessing content in response to the information associated with one or more positions of one or more fiducials associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to access content in response to the information associated with one or more positions of one or more fiducials associated with the one or more projection surfaces 200. For example, in some embodiments, one or more fiducials may be configured to produce one or more reflection patterns that depend upon the position of the projection surface 200. Accordingly, in some embodiments, one or more projector control units 120 may access content in response to one or more reflection patterns that are produced by one or more fiducials. In some embodiments, one or more fiducials may be operably associated with a projection surface 200 such that the relative positions of the fiducials may be detected to determine the position of the projection surface 200. One or more projector control units 120 may receive information associated with the position of the projection surface 200 and access content in response to the position of the projection surface 200.
At operation 608, the accessing operation 220 may include accessing content in response to the information associated with one or more calibration images associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to access content in response to the information associated with one or more calibration images associated with the one or more projection surfaces 200. In some embodiments, one or more calibration images may be projected onto a projection surface 200. In some embodiments, one or more calibration images may be physically associated with the projection surface 200. For example, in some embodiments, one or more calibration images may be printed onto a projection surface 200. In some embodiments, one or more calibration images may be embedded within a projection surface 200. In some embodiments, the position of a projection surface 200 may be determined through determining distortion of a calibration image that is related to the position of the projection surface 200. In some embodiments, the position of a projection surface 200 may be determined through determining the reflection pattern produced by a calibration image that is related to the position of the projection surface 200. Accordingly, in some embodiments, a database may be prepared that correlates the position of a projection surface 200 with calibration images associated with the projection surface 200. In some embodiments, a lookup table may be prepared that correlates the position of a projection surface 200 with calibration images associated with the projection surface 200.
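Recovering position information from the distortion of a calibration image is commonly done by fitting a homography between the stored image and its detected appearance. The sketch below uses OpenCV's cv2.findHomography on four corner correspondences; the point values are invented, and this is one standard technique rather than necessarily the one contemplated in the disclosure.

```python
# Hypothetical: recover the mapping between a stored calibration image and
# its detected, distorted appearance, as a proxy for surface position.
# Assumes OpenCV (cv2) and NumPy are available.
import numpy as np
import cv2

# Corners of the calibration image as stored (undistorted), in pixels.
reference = np.array([[0, 0], [100, 0], [100, 100], [0, 100]], dtype=np.float32)
# The same corners as detected by a camera 163 (values invented).
detected = np.array([[12, 8], [108, 22], [96, 118], [4, 102]], dtype=np.float32)

H, _mask = cv2.findHomography(reference, detected)
print(H)  # 3x3 homography; its parameters encode the surface's pose
```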
At operation 610, the accessing operation 220 may include accessing content in response to the information associated with one or more reflection patterns associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to access content in response to the information associated with one or more reflection patterns associated with the one or more projection surfaces 200. In some embodiments, a projection surface 200 may include a reflective coating. In some embodiments, a projection surface 200 may include portions that include a reflective coating. In some embodiments, a projection surface 200 may include reflectors that are associated with the projection surface 200. Accordingly, in some embodiments, reflection from a projection surface 200 may be detected to determine the position of a projection surface 200. In some embodiments, one or more preselected reflection patterns may be correlated with one or more positions of a projection surface 200. In some embodiments, one or more reflection patterns may be used to determine the position of a projection surface 200.
FIG. 7 illustrates alternative embodiments of the example operational flow 200 of FIG. 2. FIG. 7 illustrates example embodiments where the accessing operation 220 may include at least one additional operation. Additional operations may include an operation 702, operation 704, operation 706, operation 708, and/or operation 710.
At operation 702, the accessing operation 220 may include accessing content in response to the information associated with one or more vertical positions associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to access content in response to the information associated with one or more vertical positions associated with the one or more projection surfaces 200. In some embodiments, the vertical position may be relative to the entire projection surface 200. In some embodiments, the vertical position may be relative to one or more portions of the projection surface 200. In some embodiments, a projection surface 200 may exhibit a vertical position that depends upon the conformation of the projection surface 200. For example, in some embodiments, a projection surface 200 that is a flat sheet may be folded into a cube. Accordingly, in some embodiments, one or more projector control units 120 may access content in response to the vertical position of at least a portion of a projection surface 200.
At operation 704, the accessing operation 220 may include accessing content in response to the information associated with one or more horizontal positions associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to access content in response to the information associated with one or more horizontal positions associated with the one or more projection surfaces 200. In some embodiments, the horizontal position may be relative to the entire projection surface 200. In some embodiments, the horizontal position may be relative to one or more portions of the projection surface 200. In some embodiments, a projection surface 200 may exhibit a horizontal position that depends upon the conformation of the projection surface 200. For example, in some embodiments, a projection surface 200 that is a flat sheet may be folded in half. Accordingly, in some embodiments, one or more projector control units 120 may access content in response to the horizontal position of at least a portion of a projection surface 200.
At operation 706, the accessing operation 220 may include accessing content in response to the information associated with one or more rotational positions associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to access content in response to the information associated with one or more rotational positions associated with the one or more projection surfaces 200. In some embodiments, the rotational position of the one or more fiducials may be detected to determine the position of the projection surface 200. Accordingly, in some embodiments, one or more projector control units 120 may access content in response to the rotational position of a projection surface 200.
At operation 708, the accessing operation 220 may include accessing content in response to receiving one or more signals that include the information associated with the one or more positions associated with one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to access content in response to receiving one or more signals that include the information associated with the one or more positions associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may receive one or more signals from numerous sources. For example, in some embodiments, one or more projector control units 120 may receive one or more signals that were transmitted by one or more surface transmitters 204. In some embodiments, one or more projector control units 120 may receive one or more signals that were transmitted by one or more external devices 400. In some embodiments, one or more projector control units 120 may access one or more databases in response to the one or more signals. In some embodiments, one or more projector control units 120 may access one or more lookup tables in response to the one or more signals. In some embodiments, one or more projector control units 120 may access one or more broadcast media sources in response to the one or more signals. Accordingly, one or more projector control units 120 may access content from numerous sources.
At operation 710, the accessing operation 220 may include accessing content that is to be projected. In some embodiments, one or more projector control units 120 may be configured to access content that is to be projected. In some embodiments, one or more projector control units 120 may access information in a manner that depends upon one or more specific positions that are associated with one or more projection surfaces 200. For example, in some embodiments, one or more projector control units 120 may access nonconfidential information if a projection surface 200 is positioned proximate to a group of individuals and may access confidential information if the projection surface 200 is positioned proximate to a specified individual. In some embodiments, one or more projector control units 120 may access control memory 124 to access content that is to be projected. In some embodiments, one or more projector control units 120 may access projector memory 134 to access content that is to be projected. In some embodiments, one or more projector control units 120 may receive one or more signals that include information associated with content that is to be projected.
FIG. 8 illustrates alternative embodiments of the example operational flow 200 of FIG. 2. FIG. 8 illustrates example embodiments where the accessing operation 220 may include at least one additional operation. Additional operations may include an operation 802, operation 804, operation 806, operation 808, and/or operation 810.
At operation 802, the accessing operation 220 may include accessing information about content that is not to be projected. In some embodiments, one or more projector control units 120 may be configured to access information about content that is not to be projected. In some embodiments, one or more projector control units 120 may access information in a manner that depends upon one or more specific positions that are associated with one or more projection surfaces 200. For example, in some embodiments, one or more projector control units 120 may access information about confidential information that is not to be projected if a projection surface 200 is placed in a face-up position.
At operation 804, the accessing operation 220 may include accessing content in response to the information associated with one or more projection attributes associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to access content in response to the information associated with one or more projection attributes associated with the one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may access content in accordance with numerous projection attributes that are associated with one or more projection surfaces 200. Examples of such attributes include, but are not limited to, reflectivity, absorbance, ability to preferentially transmit certain wavelengths of light, and the like. For example, in some embodiments, one or more projector control units 120 may access graphical content in response to information associated with one or more projection surfaces 200 that are configured to display graphics. In some embodiments, one or more projector control units 120 may access textual content in response to information associated with one or more projection surfaces 200 that are configured to display text.
At operation 806, the accessing operation 220 may include accessing content in response to the information associated with one or more capture capabilities associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to access content in response to the information associated with one or more capture capabilities associated with the one or more projection surfaces 200. Examples of capture capabilities include, but are not limited to, printing of projected content, transmission of one or more signals that include information associated with projected content, and the like. In some embodiments, one or more projector control units 120 may be configured to access content that is nonconfidential in response to information associated with a projection surface 200 that is able to capture information that is projected onto the projection surface 200.
At operation 808, the accessing operation 220 may include accessing content in response to the information associated with one or more recording attributes associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may be configured to access content in response to the information associated with one or more recording attributes associated with the one or more projection surfaces 200. Examples of recording attributes include, but are not limited to, permanent recordation of projected content, storage of projected content into memory, and the like. Accordingly, in some embodiments, one or more projector control units 120 may access content that is to be recorded into memory if a projection surface 200 is capable of recording the information. In some embodiments, a location (e.g., a coffee shop) may include projection surfaces 200 that are capable of recording information and projection surfaces 200 that are not capable of recording information. Accordingly, in some embodiments, one or more projector control units 120 may be configured to access content that is not to be recorded for projection onto a projection surface 200 that is incapable of recording the information. In some embodiments, one or more projector control units 120 may be configured to access content that is to be recorded for projection onto a projection surface 200 that is capable of recording the information.
At operation 810, the accessing operation 220 may include receiving one or more signals that include content. In some embodiments, one or more projector control units 120 may be configured to receive one or more signals that include content. In some embodiments, one or more projector control units 120 may receive one or more signals from numerous sources. Examples of such sources include, but are not limited to, external devices 400, user interfaces 300, and the like.
FIG. 9 illustrates alternative embodiments of the example operational flow 200 of FIG. 2. FIG. 9 illustrates example embodiments where the accessing operation 220 may include at least one additional operation. Additional operations may include an operation 902 and/or operation 904.
At operation 902, the accessing operation 220 may include receiving one or more signals that include broadcast media. In some embodiments, one or more projector control units 120 may be configured to receive one or more signals that include broadcast media. For example, in some embodiments, one or more projector control units 120 may be configured to receive television signals. In some embodiments, one or more projector control units 120 may be configured to receive radio signals. Accordingly, in some embodiments, one or more projector control units 120 may be configured to project television content in response to one or more positions associated with one or more projection surfaces 200. For example, in some embodiments, a projection surface 200 that is configured for television viewing may be slid into a wall pocket while not in use and pulled from the wall pocket for use.
At operation 904, the accessing operation 220 may include receiving one or more signals that include web-based media. In some embodiments, one or more projector control units 120 may be configured to receive one or more signals that include web-based media. For example, in some embodiments, one or more projector control units 120 may be configured to receive information through connection to the internet. Accordingly, in some embodiments, one or more projector control units 120 may be configured to access content from the internet in response to one or more positions of one or more projection surfaces 200. For example, in some embodiments, one or more projector control units 120 may be configured to access electronic mail in response to a position associated with a projection surface 200.
In FIG. 10 and in following figures that include various examples of operations used during performance of the method, discussion and explanation may be provided with respect to any one or combination of the above-described examples of FIG. 1, and/or with respect to other examples and contexts. In some embodiments, modules 210 and 220 of FIG. 2 may correspond to modules 1010 and 1020 of FIG. 10. However, it should be understood that the operations may be executed in a number of other environments and contexts, and/or modified versions of FIG. 1. Also, although the various operations are presented in the sequence(s) illustrated, it should be understood that the various operations may be performed in other orders than those which are illustrated, or may be performed concurrently.
After a start operation, the operational flow 1000 includes an obtaining operation 1010 involving obtaining information related to one or more positions associated with one or more projection surfaces. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions associated with one or more projection surfaces 200 directly. For example, in some embodiments, one or more projector control units 120 may obtain information from one or more sensors 150. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions of one or more projection surfaces 200 indirectly. For example, in some embodiments, one or more projector control units 120 may obtain information from one or more external devices 400. In some embodiments, one or more projector control units 120 may receive one or more signals that include information associated with one or more positions of one or more projection surfaces 200 from one or more external devices 400. One or more projector control units 120 may obtain numerous types of information associated with one or more positions of one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may obtain information associated with the position of one or more fiducials that are associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions of one or more marks associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions of one or more calibration images that are associated with one or more projection surfaces 200.
After a start operation, the operational flow 1000 includes an accessing operation 1020 involving accessing content in response to the information related to one or more positions associated with one or more projection surfaces. In some embodiments, one or more projector control units 120 may access content in response to the information related to one or more positions associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may access content that is included within control memory 124. In some embodiments, one or more projector control units 120 may access content through use of one or more external devices 400. In some embodiments, one or more projector control units 120 may access content that is contained within external memory 500. In some embodiments, one or more projector control units 120 may access content through receipt of one or more signals that include content. Numerous types of content may be accessed. Examples of such content include, but are not limited to, images, text, web-based content, broadcast content, and the like. In some embodiments, one or more projector control units 120 may access content through use of a lookup table. For example, in some embodiments, one or more projector control units 120 may access content through comparing one or more positions of one or more projection surfaces 200 to one or more positions that are indexed to content within a lookup table.
After a start operation, the operational flow 1000 includes a projecting operation 1030 involving projecting in response to the accessing content. In some embodiments, one or more projectors 130 may project in response to the accessing content. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project in response to the accessing content. In some embodiments, one or more projectors 130 may project content that is selected in response to one or more positions of one or more projection surfaces 200. In some embodiments, one or more projectors 130 may adjust projection output in response to one or more positions of one or more projection surfaces 200. For example, in some embodiments, one or more projectors 130 may adjust the intensity of light that is projected onto one or more surfaces in response to one or more positions of one or more projection surfaces 200. In some embodiments, one or more projectors 130 may adjust the wavelengths of light that are projected onto one or more surfaces in response to one or more positions of one or more projection surfaces 200. In some embodiments, one or more projectors 130 may project content onto two or more separate projection surfaces 200 in response to one or more positions of at least one of the two or more projection surfaces 200.
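One simple way to picture the intensity adjustment in projecting operation 1030 is to scale projector output with the square of the surface's distance so the illuminance at the surface stays roughly constant. The inverse-square assumption and the numbers below are illustrative only:

```python
# Hypothetical: adjust projected light intensity so that the illuminance at
# the surface stays roughly constant as the surface moves.

def projector_output(target_illuminance, distance_m, reference_distance_m=1.0):
    """Scale output with the square of distance (inverse-square assumption)."""
    scale = (distance_m / reference_distance_m) ** 2
    return target_illuminance * scale

print(projector_output(100.0, 1.0))  # -> 100.0 at the reference distance
print(projector_output(100.0, 2.0))  # -> 400.0 when the surface is farther away
```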
FIG. 11 illustrates alternative embodiments of the example operational flow 1000 of FIG. 10. FIG. 11 illustrates example embodiments where the projecting operation 1030 may include at least one additional operation. Additional operations may include an operation 1102, operation 1104, operation 1106, operation 1108, and/or operation 1110.
At operation 1102, the projecting operation 1030 may include projecting one or more images. In some embodiments, one or more projectors 130 may project one or more images. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project one or more images in response to one or more positions of one or more projection surfaces 200. For example, in some embodiments, one or more projectors 130 may project one or more pictures onto a projection surface 200 in response to the projection surface 200 being placed onto a tabletop.
At operation 1104, the projecting operation 1030 may include projecting text. In some embodiments, one or more projectors 130 may project text. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project text in response to one or more positions of one or more projection surfaces 200. For example, in some embodiments, one or more projectors 130 may project text onto a projection surface 200 in response to the projection surface 200 being hung on a wall.
At operation 1106, the projecting operation 1030 may include projecting broadcast media. In some embodiments, one or more projectors 130 may project broadcast media. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project broadcast media in response to one or more positions of one or more projection surfaces 200. For example, in some embodiments, one or more projectors 130 may project a television program onto a projection surface 200 in response to the projection surface 200 being hung on a wall.
At operation 1108, the projecting operation 1030 may include projecting instructions. In some embodiments, one or more projectors 130 may project instructions. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project instructions in response to one or more positions of one or more projection surfaces 200. For example, in some embodiments, one or more projectors 130 may project instructions onto an automobile motor in response to one or more positions of the automobile motor. Accordingly, in some embodiments, system 100 may be configured for use in assembly processes. For example, in some embodiments, system 100 may be configured to be installed on an assembly line and project images and/or instructions to assist workers. In some embodiments, system 100 may be configured for medical use. In some embodiments, one or more projectors 130 may be instructed to project one or more images and/or instructions during a surgical procedure. For example, in some embodiments, one or more projectors 130 may be instructed to project content in response to one or more positions of a human body.
At operation 1110, the projecting operation 1030 may include projecting web-based media. In some embodiments, one or more projectors 130 may project web-based media. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project web-based media in response to one or more positions of one or more projection surfaces 200. For example, in some embodiments, one or more projectors 130 may project electronic mail onto a projection surface 200 in response to the projection surface 200 being placed on a tabletop. In some embodiments, one or more projectors 130 may project a web browser onto a projection surface 200 in response to the projection surface 200 being hung on a wall.
In FIG. 12 and in following figures that include various examples of operations used during performance of the method, discussion and explanation may be provided with respect to any one or combination of the above-described examples of FIG. 1, and/or with respect to other examples and contexts. In some embodiments, modules 1010 and 1020 of FIG. 10 may correspond to modules 1210 and 1220 of FIG. 12. However, it should be understood that the operations may be executed in a number of other environments and contexts, and/or modified versions of FIG. 1. Also, although the various operations are presented in the sequence(s) illustrated, it should be understood that the various operations may be performed in other orders than those which are illustrated, or may be performed concurrently.
After a start operation, the operational flow 1200 includes an obtaining operation 1210 involving obtaining information related to one or more positions associated with one or more projection surfaces. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions associated with one or more projection surfaces 200 directly. For example, in some embodiments, one or more projector control units 120 may obtain information from one or more sensors 150. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions of one or more projection surfaces 200 indirectly. For example, in some embodiments, one or more projector control units 120 may obtain information from one or more external devices 400. In some embodiments, one or more projector control units 120 may receive one or more signals that include information associated with one or more positions of one or more projection surfaces 200 from one or more external devices 400. One or more projector control units 120 may obtain numerous types of information associated with one or more positions of one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may obtain information associated with the position of one or more fiducials that are associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions of one or more marks associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions of one or more calibration images that are associated with one or more projection surfaces 200.
After a start operation, the operational flow 1200 includes an accessing operation 1220 involving accessing content in response to the information related to one or more positions associated with one or more projection surfaces. In some embodiments, one or more projector control units 120 may access content in response to the information related to one or more positions associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may access content that is included within control memory 124. In some embodiments, one or more projector control units 120 may access content through use of one or more external devices 400. In some embodiments, one or more projector control units 120 may access content that is contained within external memory 500. In some embodiments, one or more projector control units 120 may access content through receipt of one or more signals that include content. Numerous types of content may be accessed. Examples of such content include, but are not limited to, images, text, web-based content, broadcast content, and the like. In some embodiments, one or more projector control units 120 may access content through use of a lookup table. For example, in some embodiments, one or more projector control units 120 may access content through comparing one or more positions of one or more projection surfaces 200 to one or more positions that are indexed to content within a lookup table.
After a start operation, the operational flow 1200 includes a coordinating operation 1230 involving coordinating one or more positions associated with the one or more projection surfaces with one or more commands. In some embodiments, one or more projector processors 131 may facilitate coordinating one or more positions associated with the one or more projection surfaces 200 with one or more commands. Examples of commands include, but are not limited to, commands to: increase light output from one or more projectors 130, decrease light output from one or more projectors 130, select one or more wavelengths of light for projection, select one or more wavelengths of light that are not to be projected, direct projection outputs, project in response to position, project in response to the position of one or more marks associated with one or more projection surfaces 200, select content for projection, select content that is not to be projected, project in response to one or more attributes associated with one or more projection surfaces 200, project in response to one or more capabilities associated with one or more projection surfaces 200, save content into memory, and the like. In some embodiments, one or more projector control units 120 may access memory. For example, in some embodiments, one or more projector control units 120 may access one or more lookup tables that include correlations of one or more positions of one or more projection surfaces 200 with one or more commands. In some embodiments, one or more projector control units 120 may access one or more algorithms that may be used to correlate one or more positions of one or more projection surfaces 200 with one or more commands.
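As a non-limiting sketch, the correlation of positions with commands may be table-driven, algorithmic, or both; the command names and the distance heuristic below are hypothetical.

    # Hypothetical table correlating positions with commands.
    POSITION_COMMAND_TABLE = {
        "near_window": ["decrease_light_output"],
        "near_presenter": ["select_content_for_projection", "direct_projection_output"],
    }

    def coordinate_position(position_key: str, distance_m: float = 0.0) -> list:
        # Table-driven correlation of a position with zero or more commands.
        commands = list(POSITION_COMMAND_TABLE.get(position_key, []))
        # Algorithmic correlation, e.g., boost output for a distant surface.
        if distance_m > 3.0:
            commands.append("increase_light_output")
        return commands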
FIG. 13 illustrates alternative embodiments of the example operational flow 1200 of FIG. 12. FIG. 13 illustrates example embodiments where the coordinating operation 1230 may include at least one additional operation. Additional operations may include an operation 1302, operation 1304, operation 1306, operation 1308, and/or operation 1310.
At operation 1302, the coordinating operation 1230 may include coordinating one or more positions associated with the one or more projection surfaces with one or more projection commands. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more positions associated with the one or more projection surfaces 200 with one or more projection commands. For example, in some embodiments, one or more projector control units 120 may facilitate coordinating light transmission that is associated with one or more positions of one or more projection surfaces 200 with one or more projection commands. In some embodiments, one or more projector control units 120 may facilitate coordinating light absorbance that is associated with one or more positions of one or more projection surfaces 200 with one or more projection commands. Accordingly, in some embodiments, one or more projector control units 120 may alter the intensity of light that is projected onto the one or more projection surfaces 200 in response to the light absorbance of the one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more positions of one or more fiducials that are associated with one or more projection surfaces 200 with one or more projection commands. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more positions of one or more calibration images that are associated with one or more projection surfaces 200 with one or more projection commands.
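A first-order sketch of altering projected intensity in response to surface light absorbance follows; the linear compensation model and the numeric floor are illustrative assumptions, not part of the described embodiments.

    def adjust_intensity(base_lumens: float, absorbance: float) -> float:
        # absorbance: fraction of incident light absorbed by the surface (0.0-1.0).
        # Boost projector output so the reflected brightness approximates
        # the intended brightness on a more absorbent surface.
        reflected_fraction = max(1.0 - absorbance, 0.05)  # floor avoids division by zero
        return base_lumens / reflected_fraction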
At operation 1304, the coordinating operation 1230 may include coordinating one or more positions associated with the one or more projection surfaces with one or more content packets. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more positions associated with the one or more projection surfaces 200 with one or more content packets. In some embodiments, one or more projector control units 120 may facilitate accessing one or more content packets. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more positions of one or more projection surfaces 200 with one or more commands to access one or more content packets. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more positions of one or more projection surfaces 200 with one or more commands to access one or more content packets that include specified information. For example, in some embodiments, one or more lookup tables may include information for coordinating one or more specified positions of one or more projection surfaces 200 with one or more commands to access one or more content packets that include specified information. Accordingly, in some embodiments, one or more specified positions may be coordinated with specified information. In some embodiments, one or more lookup tables may include information for coordinating one or more specified positions of one or more projection surfaces 200 with one or more commands to access one or more specified content packets. Accordingly, in some embodiments, one or more specified positions may be coordinated with one or more specified content packets.
At operation 1306, the coordinating operation 1230 may include coordinating one or more positions associated with the one or more projection surfaces with one or more commands associated with content. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more positions associated with the one or more projection surfaces 200 with one or more commands associated with content. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more positions of one or more projection surfaces 200 with one or more commands to select content for projection. For example, in some embodiments, placing a projection surface 200 onto a tabletop in a specified location may be coordinated with one or more commands to select confidential information for projection. In some embodiments, placing a projection surface 200 onto a tabletop in a specified location may be coordinated with one or more commands to select nonconfidential information for projection. In some embodiments, a user 600 may specify one or more positions that may be coordinated with one or more commands to select content for projection. For example, a user 600 may specify that a projection surface 200 that is placed proximate to a specified individual is to be coordinated with one or more commands to select confidential information for projection. Accordingly, in some embodiments, numerous positions of a projection surface 200 may be coordinated with one or more commands to select content for projection.
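For illustration only, position-dependent selection between confidential and nonconfidential content might resemble the following sketch; the zone names are hypothetical stand-ins for positions a user 600 might specify.

    # Hypothetical user-specified zones in which confidential projection is permitted.
    CONFIDENTIAL_ZONES = {"private_office", "seat_adjacent_to_specified_individual"}

    def select_content_for_zone(zone: str, confidential_item, nonconfidential_item):
        # A projection surface placed in a confidential zone selects the
        # confidential material; any other placement selects the
        # nonconfidential material.
        return confidential_item if zone in CONFIDENTIAL_ZONES else nonconfidential_item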
At operation 1308, the coordinating operation 1230 may include accessing one or more databases. In some embodiments, one or more projector control units 120 may facilitate accessing one or more databases. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more positions of one or more projection surfaces 200 with one or more commands to access one or more databases. For example, in some embodiments, one or more projector control units 120 may facilitate accessing one or more databases that include confidential material in response to one or more positions of one or more projection surfaces 200. In some embodiments, altering the position of a projection surface 200 may result in different databases being accessed. Accordingly, in some embodiments, one or more projector control units 120 may facilitate accessing one or more databases in response to one or more specified positions of a projection surface 200.
At operation 1310, the coordinating operation 1230 may include accessing one or more lookup tables. In some embodiments, one or more projector control units 120 may facilitate accessing one or more lookup tables. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more positions of one or more projection surfaces 200 with one or more commands to access one or more lookup tables. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more positions of one or more projection surfaces 200 with one or more commands to access one or more lookup tables that include information for coordinating the one or more positions with one or more commands. For example, in some embodiments, one or more lookup tables may include information for coordinating one or more specified positions of one or more projection surfaces 200 with one or more commands to select content for projection. In some embodiments, one or more lookup tables may include information for coordinating one or more specified positions of one or more projection surfaces 200 with one or more commands to select content that is not for projection. In some embodiments, one or more projector control units 120 may facilitate accessing one or more lookup tables in response to one or more positions of one or more projection surfaces 200. For example, in some embodiments, changing the position of a projection surface 200 from one position to another may result in different lookup tables being accessed. Accordingly, in some embodiments, one or more projector control units 120 may facilitate accessing one or more lookup tables in response to one or more specified positions of a projection surface 200.
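One illustrative way to select among multiple lookup tables in response to position is a table of tables, sketched below with hypothetical names; changing the surface location selects a different lookup table, which in turn maps an orientation to a command.

    LOOKUP_TABLE_BY_LOCATION = {
        "desk": {"vertical": "select_content_for_projection"},
        "wall": {"vertical": "select_content_for_projection",
                 "horizontal": "select_content_not_for_projection"},
    }

    def resolve_command(location: str, orientation: str):
        # Position selects the table; orientation selects the command.
        table = LOOKUP_TABLE_BY_LOCATION.get(location, {})
        return table.get(orientation)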
FIG. 14 illustrates alternative embodiments of the example operational flow 1200 of FIG. 12. FIG. 14 illustrates example embodiments where the coordinating operation 1230 may include at least one additional operation. Additional operations may include an operation 1402, operation 1404, operation 1406, operation 1408, operation 1410, and/or operation 1412.
At operation 1402, the coordinating operation 1230 may include accessing one or more content packets. In some embodiments, one or more projector control units 120 may facilitate accessing one or more content packets. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more positions of one or more projection surfaces 200 with one or more commands to access one or more content packets. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more positions of one or more projection surfaces 200 with one or more commands to access one or more content packets that include specified information. For example, in some embodiments, one or more lookup tables may include information for coordinating one or more specified positions of one or more projection surfaces 200 with one or more commands to access one or more content packets that include specified information. Accordingly, in some embodiments, one or more specified positions may be coordinated with specified information. In some embodiments, one or more lookup tables may include information for coordinating one or more specified positions of one or more projection surfaces 200 with one or more commands to access one or more specified content packets. Accordingly, in some embodiments, one or more specified positions may be coordinated with one or more specified content packets.
At operation 1404, the coordinating operation 1230 may include coordinating one or more positions associated with the one or more projection surfaces with one or more commands to select the content for projection. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more positions associated with one or more projection surfaces 200 with one or more commands to select the content for projection. For example, in some embodiments, placing a projection surface 200 in a vertical position may be coordinated with one or more commands to select content for projection. In some embodiments, placing a projection surface 200 in a horizontal position may be coordinated with one or more commands to select content for projection. Accordingly, in some embodiments, numerous positions of a projection surface 200 may be coordinated with one or more commands to select content for projection.
At operation 1406, the coordinating operation 1230 may include coordinating one or more positions associated with the one or more projection surfaces with one or more commands to select the content that is not for projection. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more positions associated with one or more projection surfaces 200 with one or more commands to select the content that is not for projection. For example, in some embodiments, the position of one or more projection surfaces 200 may be coordinated with one or more commands to select confidential information that is not for projection. In some embodiments, placing a projection surface 200 in a vertical position may be coordinated with one or more commands to select information that is not for projection. In some embodiments, a user 600 may specify one or more positions of one or more projection surfaces 200 that may be coordinated with one or more commands to select content that is not for projection. For example, a user 600 may specify that a projection surface 200 in a horizontal position is to be coordinated with one or more commands to select confidential information that is not for projection. Accordingly, in some embodiments, numerous positions of a projection surface 200 may be coordinated with one or more commands to select content that is not for projection.
At operation 1408, the coordinating operation 1230 may include coordinating one or more positions associated with the one or more projection surfaces with one or more recording attributes associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more positions associated with one or more projection surfaces 200 with one or more recording attributes associated with the one or more projection surfaces 200. For example, in some embodiments, one or more projection surfaces 200 may be placed in a vertical position to indicate that the projection surface 200 is enabled to save content that is projected onto the projection surface 200 into memory. In some embodiments, one or more projection surfaces 200 may be placed into a horizontal position to indicate that the projection surface 200 is not enabled to save content that is projected onto the projection surface 200 into memory. Accordingly, numerous positions may be coordinated with numerous recording attributes that may be associated with one or more projection surfaces 200. Examples of such recording attributes include, but are not limited to: saving projected content into memory, facilitating printing of projected content, transmitting one or more signals that include information associated with projected content, and the like.
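A minimal sketch of coordinating surface positions with recording attributes follows; the attribute names are hypothetical labels for the examples given above.

    RECORDING_ATTRIBUTES_BY_ORIENTATION = {
        "vertical": ["save_projected_content_to_memory", "facilitate_printing"],
        "horizontal": [],  # recording disabled in this position
    }

    def recording_attributes(orientation: str) -> list:
        # Coordinate a surface position with the recording attributes it enables.
        return RECORDING_ATTRIBUTES_BY_ORIENTATION.get(orientation, [])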
At operation 1410, the coordinating operation 1230 may include coordinating one or more recording attributes associated with the one or more projection surfaces with the content that is to be projected. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more recording attributes associated with one or more projection surfaces 200 with the content that is to be projected. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more recording attributes associated with one or more projection surfaces 200 with content that is to be recorded into memory. For example, in some embodiments, the ability of one or more projection surfaces 200 to facilitate saving content that is projected onto the projection surface 200 into memory may be coordinated with content that is to be projected onto the projection surface 200 and saved into memory. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more recording attributes associated with one or more projection surfaces 200 with content that is to be printed. For example, in some embodiments, the ability of one or more projection surfaces 200 to facilitate printing of content that is projected onto the projection surface 200 may be coordinated with content that is to be projected onto the projection surface 200 and printed.
At operation 1412, the coordinating operation 1230 may include coordinating one or more recording attributes associated with the one or more projection surfaces with the content that is not to be projected. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more recording attributes associated with one or more projection surfaces 200 with the content that is not to be projected. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more recording attributes associated with one or more projection surfaces 200 with content that is not to be recorded into memory. For example, in some embodiments, the ability of one or more projection surfaces 200 to facilitate saving content that is projected onto the projection surface 200 into memory may be coordinated with content that is not to be projected onto the projection surface 200. In some embodiments, one or more projector control units 120 may facilitate coordinating one or more recording attributes associated with one or more projection surfaces 200 with content that is not to be printed. For example, in some embodiments, the ability of one or more projection surfaces 200 to facilitate printing of content that is projected onto the projection surface 200 may be coordinated with content that is to be projected onto the projection surface 200 but not printed.
In FIG. 15 and in following figures that include various examples of operations used during performance of the method, discussion and explanation may be provided with respect to any one or combination of the above-described examples of FIG. 1, and/or with respect to other examples and contexts. In some embodiments, modules 1210, 1220, and 1230 of FIG. 12 may correspond to modules 1510, 1520, and 1530 of FIG. 15. However, it should be understood that the operations may be executed in a number of other environments and contexts, and/or in modified versions of FIG. 1. Also, although the various operations are presented in the sequence(s) illustrated, it should be understood that the various operations may be performed in orders other than those illustrated, or may be performed concurrently.
After a start operation, the operational flow 1500 includes an obtaining operation 1510 involving obtaining information related to one or more positions associated with one or more projection surfaces. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions associated with one or more projection surfaces 200 directly. For example, in some embodiments, one or more projector control units 120 may obtain information from one or more sensors 150. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions of one or more projection surfaces 200 indirectly. For example, in some embodiments, one or more projector control units 120 may obtain information from one or more external devices 400. In some embodiments, one or more projector control units 120 may receive one or more signals that include information associated with one or more positions of one or more projection surfaces 200 from one or more external devices 400. One or more projector control units 120 may obtain numerous types of information associated with one or more positions of one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may obtain information associated with the position of one or more fiducials that are associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions of one or more marks associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may obtain information associated with one or more positions of one or more calibration images that are associated with one or more projection surfaces 200.
After a start operation, the operational flow 1500 includes an accessing operation 1520 involving accessing content in response to the information related to one or more positions associated with one or more projection surfaces. In some embodiments, one or more projector control units 120 may access content in response to the information related to one or more positions associated with one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may access content that is included within control memory 124. In some embodiments, one or more projector control units 120 may access content through use of one or more external devices 400. In some embodiments, one or more projector control units 120 may access content that is contained within external memory 500. In some embodiments, one or more projector control units 120 may access content through receipt of one or more signals that include content. Numerous types of content may be accessed. Examples of such content include, but are not limited to, images, text, web-based content, broadcast content, and the like. In some embodiments, one or more projector control units 120 may access content through use of a lookup table. For example, in some embodiments, one or more projector control units 120 may access content through comparing one or more positions of one or more projection surfaces 200 to one or more positions that are indexed to content within a lookup table.
After a start operation, the operational flow 1500 includes a coordinating operation 1530 involving coordinating one or more positions associated with the one or more projection surfaces with one or more commands. In some embodiments, one or more projector processors 131 may facilitate coordinating one or more positions associated with the one or more projection surfaces 200 with one or more commands. Examples of commands include, but are not limited to, commands to: increase light output from one or more projectors 130, decrease light output from one or more projectors 130, select one or more wavelengths of light for projection, select one or more wavelengths of light that are not to be projected, direct projection outputs, project in response to position, project in response to the position of one or more marks associated with one or more projection surfaces 200, select content for projection, select content that is not to be projected, project in response to one or more attributes associated with one or more projection surfaces 200, project in response to one or more capabilities associated with one or more projection surfaces 200, save content into memory, and the like. In some embodiments, one or more projector control units 120 may access memory. For example, in some embodiments, one or more projector control units 120 may access one or more lookup tables that include correlations of one or more positions of one or more projection surfaces 200 with one or more commands. In some embodiments, one or more projector control units 120 may access one or more algorithms that may be used to correlate one or more positions of one or more projection surfaces 200 with one or more commands.
After a start operation, the operational flow 1500 includes a projecting operation 1540 involving projecting in response to the coordinating one or more positions associated with the one or more projection surfaces with one or more commands. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project in response to the coordinating one or more positions associated with one or more projection surfaces 200 with one or more commands. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project content in response to coordinating one or more positions of one or more projection surfaces 200 with one or more commands to select the content. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project content that was selected in response to coordinating one or more positions of one or more projection surfaces 200 with one or more commands to select and project the content. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 not to project content in response to coordinating one or more positions of one or more projection surfaces 200 with one or more commands to select content that is not for projection.
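Taken together, operations 1510 through 1540 may be viewed as a pipeline, sketched below for illustration only; the callables supplied are hypothetical stand-ins for sensor, lookup-table, and projector interfaces.

    def operational_flow_1500(read_position, lookup_content, lookup_commands, project):
        position = read_position()             # obtaining operation 1510
        content = lookup_content(position)     # accessing operation 1520
        commands = lookup_commands(position)   # coordinating operation 1530
        # Projecting operation 1540: project unless the coordinated
        # commands selected the content as not for projection.
        if content is not None and "do_not_project" not in commands:
            project(content)

    # Example invocation with trivial stand-ins:
    operational_flow_1500(
        read_position=lambda: "lectern",
        lookup_content={"lectern": "meeting_agenda"}.get,
        lookup_commands=lambda position: [],
        project=print,
    )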
FIG. 16 illustrates alternative embodiments of the example operational flow 1500 of FIG. 15. FIG. 16 illustrates example embodiments where the projecting operation 1540 may include at least one additional operation. Additional operations may include an operation 1602, operation 1604, operation 1606, operation 1608, and/or operation 1610.
At operation 1602, the projecting operation 1540 may include projecting in response to the coordinating one or more positions associated with the one or more projection surfaces with one or more projection commands. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project in response to the coordinating one or more positions associated with the one or more projection surfaces 200 with one or more projection commands. For example, in some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to increase the intensity of light projected by one or more projectors 130 in response to coordinating one or more positions of one or more projection surfaces 200 with one or more commands to alter the intensity of projected light. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to direct projection output onto one or more projection surfaces 200 in response to coordinating one or more positions of the one or more projection surfaces 200 with one or more commands to direct the projection output onto the one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project one or more wavelengths of light in response to coordinating one or more positions of one or more projection surfaces 200 with one or more commands to select one or more wavelengths of light for projection that are matched to the light transmission characteristics of the one or more projection surfaces 200.
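Selecting wavelengths matched to a surface's light transmission characteristics might, as a sketch, filter candidate wavelengths against measured transmission values; the threshold and sample data below are illustrative assumptions.

    def select_wavelengths(transmission_by_nm: dict, threshold: float = 0.6) -> list:
        # Keep only wavelengths (in nm) that the surface handles well,
        # e.g., transmission_by_nm = {450: 0.9, 550: 0.7, 650: 0.3}.
        return [nm for nm, t in transmission_by_nm.items() if t >= threshold]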
At operation 1604, the projecting operation 1540 may include projecting in response to the coordinating one or more positions associated with the one or more projection surfaces with one or more content packets. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project in response to coordinating one or more positions associated with the one or more projection surfaces 200 with one or more content packets. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to access one or more content packets in response to one or more positions of one or more projection surfaces 200. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project content included within one or more content packets in response to one or more positions of one or more projection surfaces 200.
At operation 1606, the projecting operation 1540 may include projecting in response to the coordinating one or more positions associated with the one or more projection surfaces with one or more commands associated with content. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project in response to coordinating one or more positions associated with the one or more projection surfaces 200 with one or more commands associated with content. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project specific content in response to coordinating one or more positions of one or more projection surfaces 200 with the specific content. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to access content that is included within memory. For example, in some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to access projector memory 134. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to access control memory 124. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to access memory that is associated with an external device 400.
At operation 1608, the projecting operation 1540 may include projecting in response to accessing one or more databases. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project in response to accessing one or more databases. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project content in response to coordinating one or more positions of one or more projection surfaces 200 with one or more commands to access one or more databases that contain the content. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project confidential information in response to coordinating one or more positions of one or more projection surfaces 200 with one or more commands to access one or more databases that contain the confidential information.
At operation 1610, the projecting operation 1540 may include projecting in response to accessing one or more lookup tables. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project in response to accessing one or more lookup tables. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project content in response to coordinating one or more positions of one or more projection surfaces 200 with one or more commands to access one or more lookup tables associated with the content.
FIG. 17 illustrates alternative embodiments of the example operational flow 1500 of FIG. 15. FIG. 17 illustrates example embodiments where the projecting operation 1540 may include at least one additional operation. Additional operations may include an operation 1702, operation 1704, operation 1706, operation 1708, operation 1710, and/or operation 1712.
At operation 1702, the projecting operation 1540 may include projecting in response to accessing one or more content packets. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project in response to accessing one or more content packets. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project content in response to coordinating one or more positions of one or more projection surfaces 200 with one or more commands to access one or more content packets that include the content.
At operation 1704, the projecting operation 1540 may include projecting in response to the coordinating one or more positions associated with the one or more projection surfaces with one or more commands to select content for projection. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project in response to the coordinating one or more positions associated with the one or more projection surfaces 200 with one or more commands to select content for projection. For example, in some embodiments, one or more projectors 130 may be instructed to project confidential information in response to a projection surface 200 being placed proximate to a specified individual. Accordingly, in some embodiments, a projector 130 may be instructed to project specific content in a manner that depends upon the position of a projection surface 200.
At operation 1706, the projecting operation 1540 may include projecting in response to the coordinating one or more positions associated with the one or more projection surfaces with one or more commands to select content that is not for projection. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project in response to the coordinating one or more positions associated with the one or more projection surfaces 200 with one or more commands to select content that is not for projection. For example, in some embodiments, one or more projectors 130 may be instructed to access one or more content packets that include confidential and nonconfidential information. Accordingly, in some embodiments, the one or more projectors 130 may be instructed not to project the confidential information. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 not to project content in response to coordinating one or more positions of one or more projection surfaces 200 with one or more commands to select content that is not for projection. For example, in some embodiments, one or more projectors 130 may be instructed to access one or more content packets that include confidential and nonconfidential information. Accordingly, in some embodiments, the one or more projectors 130 may be instructed not to project the confidential information.
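Withholding confidential items from a mixed content packet can be sketched as a simple filter; the packet representation (a list of dictionaries with a "confidential" flag) is an assumption made for illustration only.

    def projectable_items(content_packet: list) -> list:
        # Items flagged confidential are selected as not for projection.
        return [item for item in content_packet
                if not item.get("confidential", False)]

    # Example: only the nonconfidential agenda would be projected.
    packet = [
        {"name": "salary_report", "confidential": True},
        {"name": "meeting_agenda", "confidential": False},
    ]
    assert [i["name"] for i in projectable_items(packet)] == ["meeting_agenda"]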
At operation 1708, the projecting operation 1540 may include projecting in response to the coordinating one or more positions associated with the one or more projection surfaces with one or more recording attributes associated with the one or more projection surfaces. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project in response to the coordinating one or more positions associated with the one or more projection surfaces 200 with one or more recording attributes associated with the one or more projection surfaces 200. In some embodiments, one or more projection surfaces 200 may be configured to record content that is projected onto the projection surface 200 when the projection surface 200 is in a specified position. For example, in some embodiments, a projection surface 200 may be able to record content that is projected onto the projection surface 200 when the projection surface 200 is placed in a vertical position. In some embodiments, a projection surface 200 may be unable to record content that is projected onto the projection surface 200 when the projection surface 200 is placed in a horizontal position.
At operation 1710, the projecting operation 1540 may include projecting in response to the coordinating one or more recording attributes associated with the one or more projection surfaces with content that is to be projected. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project in response to the coordinating one or more recording attributes associated with one or more projection surfaces 200 with content that is to be projected. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project content in response to coordinating one or more recording attributes associated with one or more projection surfaces 200 with content that is to be recorded into memory. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project content in response to coordinating one or more recording attributes associated with one or more projection surfaces 200 with content that is to be printed.
At operation 1712, the projecting operation 1540 may include projecting in response to coordinating one or more recording attributes associated with the one or more projection surfaces with content that is not to be projected. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 to project in response to coordinating one or more recording attributes associated with the one or more projection surfaces 200 with content that is not to be projected. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 not to project content in response to coordinating one or more recording attributes associated with one or more projection surfaces 200 with content that is not to be recorded into memory. For example, in some embodiments, one or more projectors 130 may be instructed to access one or more content packets that include confidential and nonconfidential information. Accordingly, in some embodiments, the one or more projectors 130 may be instructed not to project the confidential information. In some embodiments, one or more projector control units 120 may instruct one or more projectors 130 not to project content in response to coordinating one or more recording attributes associated with one or more projection surfaces 200 with content that is not to be printed. For example, in some embodiments, one or more projectors 130 may be instructed to access one or more content packets that include confidential and nonconfidential information. Accordingly, in some embodiments, the one or more projectors 130 may be instructed not to project the confidential information.
FIG. 18 illustrates a partial view of a system 1800 that includes a computer program 1804 for executing a computer process on a computing device. An embodiment of system 1800 is provided using a signal-bearing medium 1802 bearing one or more instructions for obtaining information related to one or more positions associated with one or more projection surfaces and one or more instructions for accessing content in response to the information related to one or more positions associated with one or more projection surfaces. The one or more instructions may be, for example, computer executable and/or logic-implemented instructions. In some embodiments, the signal-bearing medium 1802 may include a computer-readable medium 1806. In some embodiments, the signal-bearing medium 1802 may include a recordable medium 1808. In some embodiments, the signal-bearing medium 1802 may include a communications medium 1810.
FIG. 19 illustrates a partial view of a system 1900 that includes a computer program 1904 for executing a computer process on a computing device. An embodiment of system 1900 is provided using a signal-bearing medium 1902 bearing one or more instructions for obtaining information related to one or more positions associated with one or more projection surfaces, one or more instructions for accessing content in response to the information related to one or more positions associated with one or more projection surfaces, and one or more instructions for projecting in response to accessing content. The one or more instructions may be, for example, computer executable and/or logic-implemented instructions. In some embodiments, the signal-bearing medium 1902 may include a computer-readable medium 1906. In some embodiments, the signal-bearing medium 1902 may include a recordable medium 1908. In some embodiments, the signal-bearing medium 1902 may include a communications medium 1910.
FIG. 20 illustrates a partial view of a system 2000 that includes a computer program 2004 for executing a computer process on a computing device. An embodiment of system 2000 is provided using a signal-bearing medium 2002 bearing one or more instructions for obtaining information related to one or more positions associated with one or more projection surfaces, one or more instructions for accessing content in response to the information related to one or more positions associated with one or more projection surfaces, and one or more instructions for coordinating one or more positions associated with the one or more projection surfaces with one or more commands. The one or more instructions may be, for example, computer executable and/or logic-implemented instructions. In some embodiments, the signal-bearing medium 2002 may include a computer-readable medium 2006. In some embodiments, the signal-bearing medium 2002 may include a recordable medium 2008. In some embodiments, the signal-bearing medium 2002 may include a communications medium 2010.
FIG. 21 illustrates a partial view of a system 2100 that includes a computer program 2104 for executing a computer process on a computing device. An embodiment of system 2100 is provided using a signal-bearing medium 2102 bearing one or more instructions for obtaining information related to one or more positions associated with one or more projection surfaces, one or more instructions for accessing content in response to the information related to one or more positions associated with one or more projection surfaces, one or more instructions for coordinating one or more positions associated with the one or more projection surfaces with one or more commands, and one or more instructions for projecting in response to the coordinating one or more positions associated with the one or more projection surfaces with one or more commands. The one or more instructions may be, for example, computer executable and/or logic-implemented instructions. In some embodiments, the signal-bearing medium 2102 may include a computer-readable medium 2106. In some embodiments, the signal-bearing medium 2102 may include a recordable medium 2108. In some embodiments, the signal-bearing medium 2102 may include a communications medium 2110.
All of the above U.S. patents, U.S. patent application publications, U.S. patent applications, foreign patents, foreign patent applications and non-patent publications referred to in this specification and/or listed in any Application Data Sheet, are incorporated herein by reference, to the extent not inconsistent herewith.
Those having skill in the art will recognize that the state of the art has progressed to the point where there is little distinction left between hardware, software, and/or firmware implementations of aspects of systems; the use of hardware, software, and/or firmware is generally (but not always, in that in certain contexts the choice between hardware and software can become significant) a design choice representing cost vs. efficiency tradeoffs. Those having skill in the art will appreciate that there are various vehicles by which processes and/or systems and/or other technologies described herein can be effected (e.g., hardware, software, and/or firmware), and that the preferred vehicle will vary with the context in which the processes and/or systems and/or other technologies are deployed. For example, if an implementer determines that speed and accuracy are paramount, the implementer may opt for a mainly hardware and/or firmware vehicle; alternatively, if flexibility is paramount, the implementer may opt for a mainly software implementation; or, yet again alternatively, the implementer may opt for some combination of hardware, software, and/or firmware. Hence, there are several possible vehicles by which the processes and/or devices and/or other technologies described herein may be effected, none of which is inherently superior to the others in that any vehicle to be utilized is a choice dependent upon the context in which the vehicle will be deployed and the specific concerns (e.g., speed, flexibility, or predictability) of the implementer, any of which may vary. Those skilled in the art will recognize that optical aspects of implementations will typically employ optically-oriented hardware, software, and/or firmware.
In some implementations described herein, logic and similar implementations may include software or other control structures suitable to operation. Electronic circuitry, for example, may manifest one or more paths of electrical current constructed and arranged to implement various logic functions as described herein. In some implementations, one or more media are configured to bear a device-detectable implementation if such media hold or transmit a special-purpose device instruction set operable to perform as described herein. In some variants, for example, this may manifest as an update or other modification of existing software or firmware, or of gate arrays or other programmable hardware, such as by performing a reception of or a transmission of one or more instructions in relation to one or more operations described herein. Alternatively or additionally, in some variants, an implementation may include special-purpose hardware, software, firmware components, and/or general-purpose components executing or otherwise invoking special-purpose components. Specifications or other implementations may be transmitted by one or more instances of tangible transmission media as described herein, optionally by packet transmission or otherwise by passing through distributed media at various times.
Alternatively or additionally, implementations may include executing a special-purpose instruction sequence or otherwise invoking circuitry for enabling, triggering, coordinating, requesting, or otherwise causing one or more occurrences of any functional operations described above. In some variants, operational or other logical descriptions herein may be expressed directly as source code and compiled or otherwise invoked as an executable instruction sequence. In some contexts, for example, C++ or other code sequences can be compiled directly or otherwise implemented in high-level descriptor languages (e.g., a logic-synthesizable language, a hardware description language, a hardware design simulation, and/or other such similar mode(s) of expression). Alternatively or additionally, some or all of the logical expression may be manifested as a Verilog-type hardware description or other circuitry model before physical implementation in hardware, especially for basic operations or timing-critical applications. Those skilled in the art will recognize how to obtain, configure, and optimize suitable transmission or computational elements, material supplies, actuators, or other common structures in light of these teachings.
The foregoing detailed description has set forth various embodiments of the devices and/or processes via the use of block diagrams, flowcharts, and/or examples. Insofar as such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. In one embodiment, several portions of the subject matter described herein may be implemented via Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. However, those skilled in the art will recognize that some aspects of the embodiments disclosed herein, in whole or in part, can be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one skilled in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein are capable of being distributed as a program product in a variety of forms, and that an illustrative embodiment of the subject matter described herein applies regardless of the particular type of signal bearing medium used to actually carry out the distribution. Examples of a signal bearing medium include, but are not limited to, the following: a recordable type medium such as a floppy disk, a hard disk drive, a Compact Disc (CD), a Digital Video Disk (DVD), a digital tape, a computer memory, etc.; and a transmission type medium such as a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communications link, a wireless communication link (e.g., transmitter, receiver, transmission logic, reception logic, etc.), etc.).
In a general sense, those skilled in the art will recognize that the various embodiments described herein can be implemented, individually and/or collectively, by various types of electro-mechanical systems having a wide range of electrical components such as hardware, software, firmware, and/or virtually any combination thereof; and a wide range of components that may impart mechanical force or motion such as rigid bodies, spring or torsional bodies, hydraulics, electro-magnetically actuated devices, and/or virtually any combination thereof. Consequently, as used herein “electro-mechanical system” includes, but is not limited to, electrical circuitry operably coupled with a transducer (e.g., an actuator, a motor, a piezoelectric crystal, a Micro Electro Mechanical System (MEMS), etc.), electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program which at least partially carries out processes and/or devices described herein), electrical circuitry forming a memory device (e.g., forms of memory (e.g., random access, flash, read only, etc.)), electrical circuitry forming a communications device (e.g., a modem, communications switch, optical-electrical equipment, etc.), and/or any non-electrical analog thereto, such as optical or other analogs. Those skilled in the art will also appreciate that examples of electromechanical systems include but are not limited to a variety of consumer electronics systems, medical devices, as well as other systems such as motorized transport systems, factory automation systems, security systems, and/or communication/computing systems. Those skilled in the art will recognize that electromechanical as used herein is not necessarily limited to a system that has both electrical and mechanical actuation except as context may dictate otherwise.
In a general sense, those skilled in the art will recognize that the various aspects described herein which can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, and/or any combination thereof can be viewed as being composed of various types of “electrical circuitry.” Consequently, as used herein “electrical circuitry” includes, but is not limited to, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program which at least partially carries out processes and/or devices described herein), electrical circuitry forming a memory device (e.g., forms of memory (e.g., random access, flash, read only, etc.)), and/or electrical circuitry forming a communications device (e.g., a modem, communications switch, optical-electrical equipment, etc.). Those having skill in the art will recognize that the subject matter described herein may be implemented in an analog or digital fashion or some combination thereof.
Those skilled in the art will recognize that at least a portion of the devices and/or processes described herein can be integrated into an image processing system. Those having skill in the art will recognize that a typical image processing system generally includes one or more of a system unit housing, a video display device, memory such as volatile or non-volatile memory, processors such as microprocessors or digital signal processors, computational entities such as operating systems, drivers, applications programs, one or more interaction devices (e.g., a touch pad, a touch screen, an antenna, etc.), control systems including feedback loops and control motors (e.g., feedback for sensing lens position and/or velocity; control motors for moving/distorting lenses to give desired focuses). An image processing system may be implemented utilizing suitable commercially available components, such as those typically found in digital still systems and/or digital motion systems.
Those skilled in the art will recognize that at least a portion of the devices and/or processes described herein can be integrated into a data processing system. Those having skill in the art will recognize that a data processing system generally includes one or more of a system unit housing, a video display device, memory such as volatile or non-volatile memory, processors such as microprocessors or digital signal processors, computational entities such as operating systems, drivers, graphical user interfaces 300, and applications programs, one or more interaction devices (e.g., a touch pad, a touch screen, an antenna, etc.), and/or control systems including feedback loops and control motors (e.g., feedback for sensing position and/or velocity; control motors for moving and/or adjusting components and/or quantities). A data processing system may be implemented utilizing suitable commercially available components, such as those typically found in data computing/communication and/or network computing/communication systems.
Those skilled in the art will recognize that at least a portion of the devices and/or processes described herein can be integrated into a mote system. Those having skill in the art will recognize that a typical mote system generally includes one or more memories such as volatile or non-volatile memories, processors such as microprocessors or digital signal processors, computational entities such as operating systems, user interfaces 300, drivers, sensors 150, actuators, applications programs, one or more interaction devices (e.g., an antenna, USB ports, acoustic ports, etc.), and/or control systems including feedback loops and control motors (e.g., feedback for sensing or estimating position and/or velocity; control motors for moving and/or adjusting components and/or quantities). A mote system may be implemented utilizing suitable components, such as those found in mote computing/communication systems. Specific examples of such components include Intel Corporation's and/or Crossbow Corporation's mote components and supporting hardware, software, and/or firmware.
Those skilled in the art will recognize that it is common within the art to implement devices and/or processes and/or systems, and thereafter use engineering and/or other practices to integrate such implemented devices and/or processes and/or systems into more comprehensive devices and/or processes and/or systems. That is, at least a portion of the devices and/or processes and/or systems described herein can be integrated into other devices and/or processes and/or systems via a reasonable amount of experimentation. Those having skill in the art will recognize that examples of such other devices and/or processes and/or systems might include—as appropriate to context and application—all or part of devices and/or processes and/or systems of (a) an air conveyance (e.g., an airplane, rocket, helicopter, etc.), (b) a ground conveyance (e.g., a car, truck, locomotive, tank, armored personnel carrier, etc.), (c) a building (e.g., a home, warehouse, office, etc.), (d) an appliance (e.g., a refrigerator, a washing machine, a dryer, etc.), (e) a communications system (e.g., a networked system, a telephone system, a Voice over IP system, etc.), (f) a business entity (e.g., an Internet Service Provider (ISP) entity such as Comcast Cable, Qwest, Southwestern Bell, etc.), or (g) a wired/wireless services entity (e.g., Sprint, Cingular, Nextel, etc.), etc.
In certain cases, use of a system or method may occur in a territory even if components are located outside the territory. For example, in a distributed computing context, use of a distributed computing system may occur in a territory even though parts of the system may be located outside of the territory (e.g., relay, server, processor, signal-bearing medium, transmitting computer, receiving computer, etc. located outside the territory). A sale of a system or method may likewise occur in a territory even if components of the system or method are located and/or used outside the territory. Further, implementation of at least part of a system for performing a method in one territory does not preclude use of the system in another territory.
One skilled in the art will recognize that the herein described components (e.g., operations), devices, objects, and the discussion accompanying them are used as examples for the sake of conceptual clarity and that various configuration modifications are contemplated. Consequently, as used herein, the specific exemplars set forth and the accompanying discussion are intended to be representative of their more general classes. In general, use of any specific exemplar is intended to be representative of its class, and the non-inclusion of specific components (e.g., operations), devices, and objects should not be taken as limiting.
Those skilled in the art will appreciate that a user 600 may be representative of a human user 600, a robotic user 600 (e.g., computational entity), and/or substantially any combination thereof (e.g., a user may be assisted by one or more robotic agents) unless context dictates otherwise.
With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations are not expressly set forth herein for sake of clarity. The herein described subject matter sometimes illustrates different components contained within, or connected with, different other components. It is to be understood that such depicted architectures are merely exemplary, and that in fact many other architectures may be implemented which achieve the same functionality. In a conceptual sense, any arrangement of components to achieve the same functionality is effectively “associated” such that the desired functionality is achieved. Hence, any two components herein combined to achieve a particular functionality can be seen as “associated with” each other such that the desired functionality is achieved, irrespective of architectures or intermedial components. Likewise, any two components so associated can also be viewed as being “operably connected”, or “operably coupled”, to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being “operably couplable”, to each other to achieve the desired functionality. Specific examples of operably couplable include but are not limited to physically mateable and/or physically interacting components, and/or wirelessly interactable, and/or wirelessly interacting components, and/or logically interacting, and/or logically interactable components.
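As a purely conceptual illustration of the point that two components so associated achieve the desired functionality irrespective of architectures or intermedial components, the following Python sketch couples a source to a sink both directly and through an intermediary; the sensor and display names are hypothetical.

    # Illustrative sketch only: the same functionality is achieved whether
    # two components are coupled directly or through an intermedial
    # component, i.e., the coupling is architecture-independent.
    from typing import Callable

    def sensor() -> float:
        return 42.0  # stand-in data source

    def display(value: float) -> None:
        print(f"value: {value}")  # stand-in data sink

    # Direct coupling: the sink consumes the source's output immediately.
    def direct() -> None:
        display(sensor())

    # Coupling through an intermedial component (a relay/buffer): the
    # desired functionality is identical despite the different architecture.
    def relayed(buffer: Callable[[float], float] = lambda v: v) -> None:
        display(buffer(sensor()))

    if __name__ == "__main__":
        direct()   # value: 42.0
        relayed()  # value: 42.0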
In some instances, one or more components may be referred to herein as “configured to,” “configurable to,” “operable/operative to,” “adapted/adaptable,” “able to,” “conformable/conformed to,” etc. Those skilled in the art will recognize that “configured to” can generally encompass active-state components and/or inactive-state components and/or standby-state components, unless context requires otherwise.
While particular aspects of the present subject matter described herein have been shown and described, it will be apparent to those skilled in the art that, based upon the teachings herein, changes and modifications may be made without departing from the subject matter described herein and its broader aspects and, therefore, the appended claims are to encompass within their scope all such changes and modifications as are within the true spirit and scope of the subject matter described herein.
It will be understood by those within the art that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as “open” terms (e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to claims containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an” (e.g., “a” and/or “an” should typically be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations. In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of “two recitations,” without other modifiers, typically means at least two recitations, or two or more recitations). Furthermore, in those instances where a convention analogous to “at least one of A, B, and C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, and C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.).
In those instances where a convention analogous to “at least one of A, B, or C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, or C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that typically a disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase “A or B” will typically be understood to include the possibilities of “A” or “B” or “A and B.”
With respect to the appended claims, those skilled in the art will appreciate that recited operations therein may generally be performed in any order. Also, although various operational flows are presented in a sequence(s), it should be understood that the various operations may be performed in other orders than those which are illustrated, or may be performed concurrently. Examples of such alternate orderings may include overlapping, interleaved, interrupted, reordered, incremental, preparatory, supplemental, simultaneous, reverse, or other variant orderings, unless context dictates otherwise. Furthermore, terms like “responsive to,” “related to,” or other past-tense adjectives are generally not intended to exclude such variants, unless context dictates otherwise.