US6545685B1 - Method and system for efficient edge blending in high fidelity multichannel computer graphics displays - Google Patents

Method and system for efficient edge blending in high fidelity multichannel computer graphics displays

Info

Publication number
US6545685B1
US6545685B1
Authority
US
United States
Prior art keywords
frame
overlap region
video frame
blended
polygon
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US09/231,405
Inventor
Angus Dorbie
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
RPX Corp
Morgan Stanley and Co LLC
Original Assignee
Silicon Graphics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Silicon Graphics Inc
Priority to US09/231,405
Assigned to SILICON GRAPHICS, INC. Assignment of assignors interest (see document for details). Assignors: DORBIE, ANGUS
Application granted
Publication of US6545685B1
Assigned to WELLS FARGO FOOTHILL CAPITAL, INC. Security agreement. Assignors: SILICON GRAPHICS, INC. AND SILICON GRAPHICS FEDERAL, INC. (EACH A DELAWARE CORPORATION)
Assigned to GENERAL ELECTRIC CAPITAL CORPORATION. Security interest (see document for details). Assignors: SILICON GRAPHICS, INC.
Assigned to MORGAN STANLEY & CO., INCORPORATED. Assignment of assignors interest (see document for details). Assignors: GENERAL ELECTRIC CAPITAL CORPORATION
Assigned to GRAPHICS PROPERTIES HOLDINGS, INC. Change of name (see document for details). Assignors: SILICON GRAPHICS, INC.
Assigned to RPX CORPORATION. Assignment of assignors interest (see document for details). Assignors: GRAPHICS PROPERTIES HOLDINGS, INC.
Assigned to JEFFERIES FINANCE LLC. Security interest (see document for details). Assignors: RPX CORPORATION
Anticipated expiration
Assigned to RPX CORPORATION. Release by secured party (see document for details). Assignors: JEFFERIES FINANCE LLC
Status: Expired - Lifetime

Abstract

A method for implementing edge blending between a first and second video frame to create a seamless multichannel display system. The method is implemented in a graphics computer system including a processor coupled to a memory via a bus. Within the computer system, a first video frame is rendered for display on a first video channel. A second video frame is rendered for display on a second channel. A first overlap region is rendered onto the first video frame to obtain a first blended video frame. A second overlap region is rendered onto the second video frame to obtain a second blended video frame. The first blended video frame from the first channel and the second blended video frame from the second channel are then combined such that the first overlap region and the second overlap region correspond, thereby forming a seamless junction between the first blended frame and the second blended frame and implementing a high fidelity multichannel display.

Description

FIELD OF THE INVENTION
The field of the present invention pertains to computer implemented graphics. More particularly, the present invention relates to a system and method for implementing high fidelity multichannel computer graphics displays.
BACKGROUND OF THE INVENTION
Many computer implemented applications interact with and present information to a user via graphical means. Some of the most compelling such applications involve visual computing/data visualization, whereby information is presented to the user, and interaction with the user accomplished, through the use of large graphical displays. For example, applications have been created which graphically depict information stored within a database (e.g., information describing an underground oil field) and allow the user to interact with the information in real-time, visually, by “moving” through a “virtual” representation of the information stored within the database (e.g., popularly known as “virtual reality”).
These applications typically require large, high fidelity displays for their most compelling implementation. For example, a powerful visual computing system might include a very large wrap-around screen for graphically presenting data/information to one or more users. The large wrap-around screens allow the creation of an extremely realistic, compelling, virtual representation of the information to the one or more users, allowing them to realistically “navigate” through any virtual 3D environment, such as, for example, “walking” through the rooms of a newly designed building, “flying” over a large terrain database of geo-specific data, or the like.
Accordingly, virtual reality, advanced CAD, and other similar advanced visual computing applications require large high fidelity graphics displays for effective implementation. As is well known in the art, the creation of large graphics-type visual displays in a manageable and cost efficient manner has proven problematic. One commonly used method of generating a large display is to combine multiple (e.g., two or more) smaller screens, or visual channels, into a single larger display. This type of display is referred to as a multichannel display, since the single larger image is created through the combination of two or more smaller images.
Prior art FIG. 1 shows a diagram of a typical large multichannel display system 100. System 100 includes three smaller screens (e.g., 20 feet by 30 feet), screen 101, screen 102, and screen 103, that are combined to form a single very large display (e.g., 20 feet by 90 feet). Areas 110 and 111 show the junctions between the screens. Screens 101-103 function coherently in order to create the large display (e.g., 20 feet by 90 feet) seen by a group of users 105. In this implementation, screens 101-103 are projection type screens, with the images projected from an image projector 130. Image projector 130 receives video information from an image generator 132 via a blender 131.
To ensure fidelity, the edges between the channels of screens 101-103, “blend regions” 110 and 111, need to blend seamlessly in order to create the large image. This has proven problematic. As with most arrayed multichannel displays, the edges of the combined channels must be blended such that the “seams” are as unnoticeable as possible. In projection type multichannel displays (e.g., system 100), one typical prior art approach is to overlap the edges of the image from each video feed in order to create a smooth transition from the image of one channel to the next, smoothing the seams of areas 110 and 111. This requires some overlap of the video from each image and requires that the brightness of one image reduces as the other increases in the overlap region, areas 110 and 111. To maintain high fidelity, the brightness levels of the channels need to be precisely controlled. However, achieving this level of control often requires the incorporation of expensive, dedicated hardware, such as blender 131.
Prior art FIG. 2 shows system 100 in greater detail. As depicted in FIG. 2, the image generator 132 includes graphics computers 241-243 for generating the video information for each video channel. Computers 241-243 are respectively coupled to image blenders 251-253, which are in turn coupled to projectors 201-203. Projectors 201-203 function by projecting the video information received from computers 241-243 and image blenders 251-253 onto their respective one of screens 101-103.
As described above, to maintain high fidelity, the seams of blend regions 110 and 111 need to be precisely blended such that they are as unnoticeable as possible. System 100 includes dedicated image blenders 251-253 for performing the precise brightness control required to implement the seamless overlap of screens 101-103. Computers 241-243 include graphics processing hardware and software and function by generating the video information for each respective channel for screens 101-103. Blenders 251-253 perform brightness processing on the video information received from computers 241-243.
With system 100, and other similar prior art multichannel display systems, the blending function is performed by the dedicated image blenders 251-253. Image blenders 251-253 are dedicated, special purpose hardware components which process the video information emerging from computers 241-243. The image blenders 251-253, as is typical with prior art multichannel display systems, are format/hardware implementation specific, in that they are not readily interchangeable among display systems from different manufacturers. Because of the required level of fidelity, image blenders 251-253 are relatively complex and expensive, adding significantly to the overall complexity and cost of system 100.
Thus, what is required is a method and system for implementing high fidelity blending for multichannel displays without requiring the use of dedicated blending hardware for post-image generation blending processing. What is required is a system which can be efficiently implemented on multiple computer system platforms. The required system should be inexpensive, and not require additional, dedicated hardware for its implementation.
SUMMARY OF THE INVENTION
The present invention is a method and system for implementing high fidelity blending for multichannel displays without requiring the use of dedicated blending hardware for processing video information after it emerges from an image generator. The present invention provides a system which can be efficiently implemented on multiple computer system platforms. The system of the present invention is inexpensive, and does not require additional, dedicated hardware for its implementation.
In one embodiment, the present invention is implemented as an edge blending process executed on a graphics computer system included within the image generator. The computer system included within the image generator performs the edge blending processing on the gamma-corrected image before it is taken from the frame buffer or sent to video. This eliminates the need for a separate dedicated blending hardware unit, as required with prior art multichannel display systems. The computer systems included within the image generator perform the edge blending processing on each of the video frames of the channels such that, as the video information emerges from the image generator, each of the video frames includes the necessary blending required for creating a seamless multichannel display.
The blending process of the present invention occurs as the information is being rendered by the computer systems. For example, as a left computer generated frame is rendered for display on a first video channel, a calculated blend geometry is rendered onto that frame to modulate the brightness in the blend/overlap region, resulting in a first blended frame. This occurs before the frame is sent to video. As a right frame is rendered for display on a second channel, a second complementary blend geometry is rendered onto the second frame to modulate the brightness of the second frame in the blend/overlap region, resulting in a second blended frame. The overlap regions of these blended frames have their respective brightness levels modulated such that when the first and second blended frames are sent to video and projected onto adjacent first and second screens, the resulting images overlap within the overlap regions and combine such that the first and second overlap regions precisely align. In this manner, the image from the first blended video frame and the image from the second blended video frame form a seamless junction between the two, thereby implementing a high fidelity multichannel display.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention is illustrated by way of example and not by way of limitation, in the Figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:
Prior art FIG. 1 shows a diagram of a prior art multichannel image generator and display system.
Prior art FIG. 2 shows a diagram of the multichannel image generator and display system of FIG. 1 in greater detail.
FIG. 3 shows a diagram of a multichannel image generator and display system in accordance with one embodiment of the present invention.
FIG. 4 shows a diagram of the typical components of a graphics computer included within an image generator used in a multichannel display system in accordance with one embodiment of the present invention.
FIG. 5 shows a diagram of the relationship between the respective video channels of a multichannel display system in accordance with one embodiment of the present invention.
FIG. 6 shows a diagram of the overlap regions of the center channel of the multichannel display system of FIG. 5.
FIG. 7 shows a pre-blended depiction and a post-blended depiction of the image displayed via channel 2 in accordance with one embodiment of the present invention.
FIG. 8 depicts a series of diagrams representative of the brightness level of the image information within the frame buffer of one of the graphics computers in accordance with one embodiment of the present invention.
FIG. 9 shows a diagram depicting a multi-polygon rendering method for implementing the overlap regions in accordance with one embodiment of the present invention.
FIG. 10 shows a diagram depicting the effect on the brightness level of an image as the multiple polygons from FIG. 9 are blended into the image.
FIG. 11 shows a diagram of a one-dimensional texture rendering method for implementing the overlap regions in accordance with one embodiment of the present invention.
FIG. 12 shows a diagram depicting one possible brightness profile of an image resulting from the method depicted in FIG. 11.
FIG. 13 shows a diagram depicting an alpha mask rendering method for implementing the overlap regions in accordance with one embodiment of the present invention.
FIG. 14 shows a diagram depicting an image read back from texture memory method for implementing the overlap regions in accordance with one embodiment of the present invention.
FIG. 15 shows a flow chart of the steps of a blending process in accordance with one embodiment of the present invention.
FIG. 16 shows a flow chart of the steps of an alternative blending process in accordance with one embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
In the following detailed description of the present invention, a method and system for efficient edge blending in high fidelity multichannel computer graphics displays, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be obvious to one skilled in the art that the present invention may be practiced without these specific details and on systems which do not fully implement the depicted architectural details. In other instances, well known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the present invention. It should also be recognized by one skilled in the art that some aspects of the present invention pertain to graphics subsystems, and as such, can apply not only to graphics computers, but to any computer based image generator, such as, for example, a simple computer image generator having simple controlling hardware and which connects directly to data base memory or a mass storage device.
Some portions of the detailed descriptions which follow are presented in terms of procedures, logic blocks, processing, and other symbolic representations of operations on data bits within a computer system. These descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. A procedure, logic block, process, step, etc., is here, and generally, conceived to be a self-consistent sequence of steps or instructions leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical, optical, or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the present invention, discussions utilizing terms such as “rendering” or “sending” or “processing” or “executing” or “storing” or the like, refer to the action and processes of a computer system, or similar data processing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
The present invention is a method and system for implementing high fidelity blending for multichannel displays without requiring the use of dedicated blending hardware for processing video information after it emerges from image generation. The present invention provides a system which can be efficiently implemented on multiple computer system platforms. The system of the present invention is inexpensive, and does not require additional, dedicated hardware for its implementation. The present invention and its benefits are further described below.
Referring now to FIG. 3, a multichannel display system 300 in accordance with one embodiment of the present invention is shown. In the present embodiment, system 300 includes three large projection-type screens 301-303 (e.g., 20 feet by 30 feet) which combine to form a single very large, high fidelity display (e.g., 20 feet by 90 feet). The large display is capable of presenting seamless, high fidelity graphics which span the entire area of screens 301-303. Fidelity is implemented by achieving a seamless overlap of the images of each of the screens 301-303. In the present embodiment, the overlap occurs in a first and second overlap region, region 310 and region 311.
Within these overlap regions 310 and 311, the image from each of the channels overlaps the image of each adjacent channel by a predetermined amount. For example, the image of channel 1 overlaps the image of channel 2 in overlap region 310, and likewise for channel 2 and channel 3 in overlap region 311. The brightness levels of the images of the respective channels are precisely determined and controlled such that a seamless junction is created between the images. In accordance with the present invention, the process of determining and controlling the brightness levels is implemented within an image generator unit 330, as opposed to using a separate blending unit.
Image generator 330 includes three graphics computer systems (e.g., high performance graphics workstations) which generate the computer graphics information for each respective channel. The blending process of the present invention is implemented within the graphics subsystems of the graphics computers. The image information emerging from the graphics computers includes all the necessary blending required for overlap regions 310 and 311. Hence, no additional processing need be performed on the information emerging from the graphics computers 320-322.
The video information is coupled directly to the image projectors for projection onto screens 301-303. Since each video frame has been previously blended within the graphics pipelines of computer systems 320-322, the images of screens 301-303 will blend seamlessly within overlap regions 310 and 311.
Since the blending functionality of the present invention is implemented entirely within the computer systems 320-322, the system of the present invention is much more efficient than comparable prior art multichannel display systems. By performing the required blending entirely within image generator 330, the present invention eliminates the requirement for an external, dedicated blending unit as required in typical prior art systems. This makes system 300 less expensive than comparable prior art systems.
In addition, since the blending functionality of the present invention is implemented within the graphics pipelines of computer systems 320-322, the present invention is readily portable. System 300 implements high fidelity blending for multichannel displays without requiring the use of dedicated blending hardware for processing video information after it emerges from image generation (e.g., emerges from graphics computers 320-322). Consequently, system 300 can be more easily and efficiently implemented on different types of computer system platforms.
Referring now to FIG. 4, a diagram of a graphics computer system (e.g., graphics computer 321) in accordance with one embodiment of the present invention is shown. System 321 can be any computer graphics system for generating complex or three dimensional images. Computer system 321 includes a bus 401 for transmitting digital information between the various parts of the computer system. One or more microprocessors 402 are coupled to bus 401 for processing information. The information, along with the instructions for how the information is to be processed, is stored in a hierarchical memory system comprised of mass storage device 407, read only memory 406, and main memory 404. Mass storage device 407 is used to store large amounts of digital data. The mass storage device 407 can consist of one or more hard disk drives, floppy disk drives, optical disk drives, tape drives, CD ROM drives, or any number of other types of storage devices having media for storing data digitally. A read only memory (ROM) 406 is used to store digital data on a permanent basis, such as instructions for the microprocessors. Main memory 404 is used for storing digital data on an intermediate basis. Main memory 404 can be dynamic random access memory (DRAM).
A 3D graphics rendering system 411 is an option which can be included in system 321. Processor 402 provides the graphics system 411 with graphics data, such as drawing commands, coordinate vertex data, and other data related to an object's geometric position, color, texture, shading, and other surface parameters. The object data is processed by graphics system 411 in the following four pipelined stages: geometry subsystem 431, scan conversion subsystem 432, raster subsystem 433, and a video subsystem 434. The geometry subsystem 431 converts the graphical data from processor 402 into a screen coordinate system. It is the function of the geometry subsystem 431 to perform the projection and transformation process to give depth to a displayed object. The resulting primitives (points, lines, polygons, polyhedra, and the like) supplied by the geometry subsystem 431 are then supplied to the scan conversion subsystem 432. It is the function of the scan conversion subsystem 432 to then generate pixel data (e.g., fragments, fragment parameters, color information, and the like) based on these primitives. The scan conversion subsystem 432 performs the interpolation functions to interpolate straight lines so that each intermediate value need not be individually and separately calculated by the geometry subsystem. Next, the pixel data is sent to the raster subsystem 433, whereupon Z-buffering, blending, texturing (using texture data stored in texture memory 436), and antialiasing functions are performed. The resulting pixel values are subsequently stored in frame buffer 435, and the Z values are stored in the Z buffer 410. The video subsystem 434 reads the frame buffer 435, gamma corrects the image information stored therein, and sends the image (in the correct format) to the monitor 421. The video information is also output from frame buffer 435 to projector 331 for projection onto screen 302.
Several other optional devices may also be coupled to system 321. For example, an alphanumeric keyboard 422 is used for inputting commands and other information to processor 402. Another type of user input device is cursor control device 423 (a mouse, trackball, joystick, and the like) used for positioning a movable cursor and selecting objects on a computer screen. Another device which may be coupled to bus 401 is a hard copy device 424 (e.g., a laser printer) for printing data or other information onto a tangible medium. Additionally, a sound recording or video option 425 can be coupled to the system 321 to provide multimedia capabilities.
Graphics computers 320 and 322 are each substantially similar to graphics computer 321. As with graphics computer 321, workstations 320 and 322 function by generating the image information required for projection onto their respective screens 301 and 303.
With reference now to FIG. 5 and FIG. 6, diagrams of the image projected onto the center channel (channel 2 on screen 302) of the multichannel display are shown. FIG. 5 shows the location of channel 2 with respect to channels 1 and 3. FIG. 6 shows the image of channel 2 without any overlap from channels 1 and 3.
As described above, the brightness levels of each of the channels (e.g., channels 1-3) in their overlap regions are precisely controlled in order to make the junctions between them appear as unnoticeable as possible. In the case of channel 2, the center channel, the brightness of the image displayed decreases in the regions 310 and 311. This is so that when channel 2's image is overlapped on the images of channels 1 and 3, the seam between channels 1, 2, and 3 will not be noticeable. In prior art multichannel display systems, this blending was performed using a separate, discrete blending processor or device. In accordance with the present invention, this blending is performed within the graphics subsystem as the image is being generated.
FIG. 7 shows a pre-blended depiction and a post-blended depiction of the image displayed via channel 2. The pre-blended depiction is on the left side of FIG. 7 and the post-blended depiction is on the right side of FIG. 7. Beneath each depiction is a corresponding diagram of the brightness level of the pre-blended and post-blended images, where brightness level is shown in the vertical direction and the horizontal dimension of the respective image is shown in the horizontal direction.
As shown in FIG. 7, the brightness level of the pre-blended image is uniform across the horizontal dimension of the image. However, after blending, the brightness level is decreased at the far left and far right edges of the image, the overlap regions 310 and 311. In a similar manner, the far right brightness level of the image from channel 1 decreases so that when the two channels are overlapped, the brightness level is uniform across both channels. Likewise, the far left brightness level of the image from channel 3 decreases so that when channels 2 and 3 are overlapped, the brightness level is uniform across them. In this manner, channels 1, 2, and 3 are blended to create the single large display, as the decreased brightness levels of the overlap regions 310 and 311 combine to obtain a nominal brightness level. The blending is determined by and derived from the geometry of the display system. The present invention is not limited to this simple geometry, but can support arbitrary blend shapes as determined by the particular requirements of the display design.
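As a concrete illustration of this complementary relationship (a minimal sketch only; linear ramps are just one of the arbitrary blend shapes the invention supports), consider an overlap region of width W with position x measured across it. A ramp pair whose contributions always sum to the nominal level satisfies

b1(x) + b2(x) = 1, for example b1(x) = 1 - x/W and b2(x) = x/W,

so that wherever the two projected images overlap, their combined brightness matches that of the non-overlapped portions of the display.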
FIG. 8 depicts a series of diagrams representative of the brightness level of the image information within the frame buffer of one of the graphics computers 320-322 (e.g., workstation 321). The left column of FIG. 8 shows two diagrams (diagram 801 and diagram 804) representing the brightness level of the contents of the frame buffer. The center column of FIG. 8 shows two diagrams (diagrams 802 and 805) of the gamma level of the screen (e.g., screen 302), and the right column shows two diagrams (diagrams 803 and 806) of the brightness level of the screen (e.g., display screen 302) as actually seen by a user. The three diagrams on the top row of FIG. 8 (diagrams 801-803) depict a case where blending is improperly implemented. The three diagrams on the bottom row of FIG. 8 (diagrams 804-806) depict a case where, in accordance with the present invention, blending is properly implemented.
In accordance with the present embodiment, the blending is performed as the image information is being rendered into the frame buffer of the workstation (e.g., workstation 321). Diagram 801 and diagram 804 represent the brightness level of the post blended image stored within the frame buffer (e.g., frame buffer 435). To achieve uniform nominal brightness on display screen 302, it is important that the post blended image is processed to take into account the gamma level of the screen 302. The top row of FIG. 8 depicts a case where this is improperly implemented. For example, when the brightness level of the image (e.g., diagram 801) combines with the gamma level of the image projector 331 (e.g., diagram 802), a nonuniform brightness results (e.g., diagram 803) since the blending performed on the image rendered into the frame buffer 435 did not properly take into account the gamma level of the screen 302. This is for a situation where gamma correction is not supported in the video system 434 of the graphics subsystem 411.
The bottom row of FIG. 8 depicts a case where the blending is properly implemented. The image is rendered into frame buffer 435 in such a manner as to account for the gamma level of the screen 302. This yields a resulting image on screen 302 having uniform brightness across its area, as shown by diagram 806. The present invention is able to take into account the gamma characteristics of each of the projectors 330-332. In accordance with the present invention, the images are rendered into the frame buffer with their respective overlap regions blended in consideration of the gamma characteristics of the respective screens. Since the blending process of the present invention occurs as the images are being rendered, the present invention can be easily adjusted in light of the gamma characteristics of any particular screen.
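To make the gamma compensation concrete, the following fragment is a minimal sketch (hypothetical code, not taken from the patent) assuming the projector follows a simple power-law response, output = input^gamma, and that the video subsystem applies no gamma correction of its own. The blend value written to the frame buffer is then the desired linear brightness fraction raised to 1/gamma:

#include <math.h>

/* Sketch: frame-buffer modulation value for a desired linear screen
 * brightness fraction, pre-compensated for a power-law projector
 * response (out = in^gamma). Assumes no gamma correction downstream. */
float blend_value(float linear_fraction, float screen_gamma)
{
    return (float)pow(linear_fraction, 1.0 / screen_gamma);
}

For example, with a screen gamma of 2.2, a desired half-brightness point in the overlap would be stored as 0.5^(1/2.2), roughly 0.73, rather than 0.5.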
FIG. 9 shows one method in accordance with the present invention for generating the blended overlap regions. In this method, the brightness control is achieved by using the graphics processing capabilities of the graphics subsystem (e.g., graphics subsystem 411) of the graphics computer (e.g., graphics computer 321). A set of polygons is drawn across the overlap regions of a rendered image (e.g., the image rendered into the frame buffer 435 that is to be processed and subsequently displayed on one of screens 301-303). Each of the polygons has a respective transparency value (e.g., alpha) and a color value. In this embodiment, the process of the present invention blends the image 900 by multiplying the image intensity by the intensity described by the color and/or transparency values of the polygons 901-905 and 906-910. This has the effect of decreasing the brightness level of the image 900 in the manner required to accomplish the seamless blend (e.g., the post blend brightness level shown in FIG. 7). The blend intensity of each of the polygons 901-910 can be described in the form of a texture applied to the polygons 901-910 or per vertex intensity values.
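A minimal OpenGL sketch of this multi-polygon modulation follows. It is illustrative only: the patent does not prescribe an API, and the quad-strip layout, step count, and linear ramp here are assumptions. The blend function glBlendFunc(GL_ZERO, GL_SRC_COLOR) makes each incoming polygon color multiply the color already in the frame buffer, which is exactly the multiplicative intensity modulation described above:

#include <GL/gl.h>

/* Sketch: darken a right-hand overlap region of an already-rendered frame
 * with a strip of quads whose per-vertex gray levels ramp the brightness
 * from 1.0 down to 0.0 (a piecewise linear profile). */
void render_blend_polygons(float x0, float x1, float y0, float y1, int steps)
{
    int i;
    glEnable(GL_BLEND);
    glBlendFunc(GL_ZERO, GL_SRC_COLOR);    /* dst = polygon color * dst */
    glBegin(GL_QUAD_STRIP);
    for (i = 0; i <= steps; i++) {
        float t = (float)i / (float)steps; /* 0..1 across the overlap */
        float g = 1.0f - t;                /* brightness multiplier   */
        glColor3f(g, g, g);
        glVertex2f(x0 + t * (x1 - x0), y0);
        glVertex2f(x0 + t * (x1 - x0), y1);
    }
    glEnd();
    glDisable(GL_BLEND);
}

Smooth (Gouraud) interpolation of the per-vertex grays produces linear segments between steps; increasing the step count refines the approximation, mirroring the discussion of FIG. 10 below.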
FIG. 10 shows a diagram depicting the effect on the brightness level of the image 900 as the colors of the polygons 906-910 are blended in. As shown in FIG. 10, controlling the brightness level by blending multiple polygons into the image results in a “piecewise linear” approximation of the desired smooth profile from the normal brightness level of the image 900 to the fully dark brightness level of the image, on the far right side of the overlap region, shown by polygon 910. The desired “brightness profile” of the overlap regions is controlled by adjusting the color/transparency of the multiple polygons, and by increasing or decreasing the number of polygons used to create the brightness profile.
FIG. 11 shows a second method in accordance with the present invention for generating the blended overlap regions using polygons. This method differs from the method shown in FIG. 9 in that instead of using many different polygons (e.g., polygons 901-905 and polygons 906-910) to create the overlap regions, a single polygon is used for each overlap region, polygon 1101 and polygon 1102. Polygons 1101 and 1102 both have a respective one-dimensional texture that is used to control their respective brightness. The one-dimensional texture includes a number of samples that describe the color/transparency of the texture. Hence, as the one-dimensional texture is mapped onto, for example, polygon 1102, the brightness of the underlying image 1100 decreases in accordance with the brightness profile described by the one-dimensional texture. This is depicted in FIG. 12 below.
FIG. 12 shows the brightness profile of the polygon 1102 as described by the samples, n, of the one-dimensional texture that is mapped onto polygon 1102. This brightness profile would be substantially similar to the brightness profile shown in FIG. 10; however, the brightness profile of FIG. 12 is more “continuous” since a larger number of samples are used to describe the profile, and more complex shapes can be efficiently represented. This method tends to emphasize the use of the texture hardware of graphics subsystem 411, while the method of FIG. 9 tends to emphasize the use of the geometry and shading hardware of graphics subsystem 411.
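A sketch of this one-dimensional-texture variant is shown below. It is hypothetical (the patent names no API, and the sample count, linear ramp contents, and filtering choices are assumptions): a luminance ramp is built once as a GL_TEXTURE_1D and then mapped across a single quad covering the overlap region, again using the multiplicative blend function:

#include <GL/gl.h>
#include <stdlib.h>

/* Sketch: build an n-sample 1D luminance ramp; linear filtering between
 * samples yields a near-continuous brightness profile. */
static GLuint make_ramp_texture(int n)
{
    GLuint tex;
    GLubyte *ramp = (GLubyte *)malloc(n);
    int i;
    for (i = 0; i < n; i++)
        ramp[i] = (GLubyte)(255.0f * (1.0f - (float)i / (float)(n - 1)));
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_1D, tex);
    glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage1D(GL_TEXTURE_1D, 0, GL_LUMINANCE, n, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, ramp);
    free(ramp);
    return tex;
}

/* Sketch: one textured quad modulates the overlap region of the image. */
void render_blend_texture(GLuint tex, float x0, float x1, float y0, float y1)
{
    glEnable(GL_TEXTURE_1D);
    glBindTexture(GL_TEXTURE_1D, tex);
    glEnable(GL_BLEND);
    glBlendFunc(GL_ZERO, GL_SRC_COLOR);    /* dst = texel * dst */
    glBegin(GL_QUADS);
    glTexCoord1f(0.0f); glVertex2f(x0, y0);
    glTexCoord1f(1.0f); glVertex2f(x1, y0);
    glTexCoord1f(1.0f); glVertex2f(x1, y1);
    glTexCoord1f(0.0f); glVertex2f(x0, y1);
    glEnd();
    glDisable(GL_BLEND);
    glDisable(GL_TEXTURE_1D);
}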
FIG. 13 shows a third method in accordance with the present invention for generating the blended overlap regions by using an alpha mask. FIG. 13 depicts an alpha mask 1301 which is stored into the frame buffer 435 prior to the rendering of an image. Alpha mask 1301 includes a unique blend modulation value for each pixel in the frame buffer 435. These values are used to create the desired brightness profile for the image when the image is later rendered into the frame buffer 435. When the scene is later rendered, each pixel written to the frame buffer is multiplied by the alpha brightness already stored within the frame buffer 435. This results in an image with a correctly modulated brightness intensity, ready to be sent to projector 331.
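One way this could be realized on OpenGL hardware with destination-alpha planes is sketched below (hypothetical; the patent does not prescribe an API, and draw_alpha_mask and draw_scene are stand-ins for application code). The mask is written into the frame buffer's alpha channel first; the scene is then drawn with a blend function that multiplies each incoming fragment by the stored alpha:

#include <GL/gl.h>

void draw_alpha_mask(void);  /* application: writes the brightness profile */
void draw_scene(void);       /* application: renders the image             */

/* Sketch: per-pixel alpha-mask blending using destination alpha.
 * Requires an RGBA frame buffer with alpha planes. */
void render_masked_frame(void)
{
    /* 1. Lay down the blend profile in the alpha planes only. */
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_TRUE);
    glDisable(GL_BLEND);
    draw_alpha_mask();
    /* 2. Render the scene with color writes on and alpha writes off, so
     *    the mask survives overdraw; each fragment is scaled by the
     *    stored alpha: dst = src * dst_alpha. */
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_FALSE);
    glEnable(GL_BLEND);
    glBlendFunc(GL_DST_ALPHA, GL_ZERO);
    draw_scene();
    glDisable(GL_BLEND);
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
}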
One advantage of using alpha masks to implement the overlap regions is that the alpha mask readily allows control of the contour or outline of the boundaries of the overlap regions. For example, as opposed to implementing overlap regions having “regular” straight boundaries (e.g., as shown in FIG. 13), the alpha masks can be configured to implement irregularly shaped overlap regions having irregularly shaped boundaries. This helps to obscure the junction between the channels of the display system, reducing the degree to which the seam is noticeable.
Additionally, although the discussions of the present invention in FIGS. 3 through 13 show the invention functioning with three screens horizontally aligned, it should be appreciated that the blending process of the present invention can be applied to multichannel displays having any number of screens arrayed in any number of configurations (e.g., tiled, stacked, etc.).
FIG. 14 shows a fourth method for generating the blended overlap regions in accordance with one embodiment of the present invention. In this method, all or parts of the image in the frame buffer 435 are copied back to texture memory 436 within graphics subsystem 411. This image information constitutes data which can be applied as a texture onto one or more polygons rendered into the frame buffer 435, where the underlying color of the polygon or polygons is used to modulate the color of the “texture” from texture memory 436 (where this texture comprises the image that was read from frame buffer 435 to texture memory 436). The left side of FIG. 14 shows an image mapped onto a polygon mesh prior to subdivision and its corresponding pre-blend brightness profile. The right side of FIG. 14 shows the image mapped onto the polygon mesh after the mesh has been subdivided in the overlap regions, and its corresponding post-blend brightness profile. In this case, the overlap regions are on each edge of the image (e.g., top, bottom, left, and right).
In cases where there is no nonlinear image mapping, only those images which are undergoing a blend are read back from frame buffer 435 to texture memory 436. Polygons are subsequently rendered into the frame buffer over the regions the image was read from. These polygons can then be subdivided to further increase their discreteness across the overlap or blend regions. The suitably discrete polygons are then textured with the image information from texture memory 436. Depending upon their respective locations within the blend regions, the polygons are assigned different colors such that when they are textured with the image information, the resulting image has its brightness level appropriately modulated within the overlap regions of the image. If the underlying image (e.g., image 1100) is suitably discrete across the overlap region, values of the n samples (e.g., blend luminance multipliers) may be specified at a per vertex level, resulting in the texture intensity being modulated on application to the frame buffer. Additionally, it should be noted that this method is similar to distortion correction solutions and other solutions where an image in texture memory is applied to an image stored within the frame buffer using a texturing operation, such as in, for example, video playback or texture read back, antialiasing, and the like.
In the prior art, techniques similar to the above are used for nonlinear image distortion correction, whereby an image rendered into the frame buffer is read back into texture memory and then subsequently texture mapped onto a polygon mesh rendered into the frame buffer. This polygon mesh has geometry characteristics which are designed to compensate for distortion caused by, for example, an irregularly shaped screen (e.g., a “wraparound” screen). Texture mapping the image onto this polygon mesh and then sending the resulting image to a projector for display corrects the distortion caused by the irregularly shaped screen. In accordance with the present invention, this technique is used to apply the blending for properly implementing overlap regions as required by a multichannel display.
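The read-back variant could be sketched as follows (again hypothetical: the patent names no API, and the mesh resolution, per-column gray values, power-of-two frame size, and a pixel-aligned orthographic projection are assumptions here). glCopyTexImage2D pulls the finished frame into texture memory, and the frame is redrawn through a vertex-colored strip whose grays modulate the texels via the GL_MODULATE texture environment:

#include <GL/gl.h>

/* Sketch: copy the rendered frame back to texture memory, then redraw it
 * through a subdivided, vertex-colored mesh; GL_MODULATE multiplies each
 * texel by the interpolated vertex gray, darkening the overlap columns. */
void readback_and_modulate(int width, int height, int columns,
                           const float *column_gray /* columns + 1 values */)
{
    GLuint tex;
    int i;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    /* Pull the finished image back from the frame buffer. */
    glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 0, 0, width, height, 0);

    glEnable(GL_TEXTURE_2D);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
    glBegin(GL_QUAD_STRIP);
    for (i = 0; i <= columns; i++) {
        float t = (float)i / (float)columns;
        float g = column_gray[i];          /* 1.0 = full brightness */
        glColor3f(g, g, g);
        glTexCoord2f(t, 0.0f); glVertex2f(t * (float)width, 0.0f);
        glTexCoord2f(t, 1.0f); glVertex2f(t * (float)width, (float)height);
    }
    glEnd();
    glDisable(GL_TEXTURE_2D);
    glDeleteTextures(1, &tex);
}

The same mesh can carry distortion-correcting vertex positions, which is why this variant combines naturally with the nonlinear correction described above.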
With reference now to FIG. 15, a flow chart of the steps of a process 1500 in accordance with one embodiment of the present invention is shown. Process 1500 depicts the steps of a blending process used by a system in accordance with the present invention (e.g., system 300), where polygons are used to implement overlap regions as the image information is being generated within an image generator (e.g., image generator 330).
Process 1500 begins in step 1501, where the graphics computers (e.g., graphics computers 320-322) within the image generator 330 execute graphics applications which generate the images for display. In step 1502, the graphics application executing on the processors of the workstations (e.g., graphics computer 321) renders an image into the frame buffer (e.g., frame buffer 435) of the graphics subsystem (e.g., graphics subsystem 411). In step 1503, the image stored in the frame buffer 435 is processed to create the necessary overlap regions (e.g., overlap regions 310-311). As described above in the discussion of FIGS. 9 through 12, polygons are rendered into the frame buffer “over” the image stored in the frame buffer in order to create the desired brightness profile. As discussed in FIGS. 9 and 10, a series of polygons can be rendered over the image stored in the frame buffer, where each polygon has a different color/transparency value, in order to implement the desired brightness profile. Alternatively, as discussed in FIGS. 11 and 12, a one-dimensional texture having a plurality of color/transparency samples can be mapped onto a single polygon, and that single polygon used to implement an overlap region on top of the image stored within the frame buffer.
With reference still to FIG. 15, in step 1504, the blended image in the frame buffer 435 is converted into video format and sent to a video projector (e.g., image projector 331). In step 1505, the blended image is then projected by image projector 331 onto the respective screen (e.g., screen 302). Process 1500 is performed by each of the graphics computers within image generator 330, resulting in a seamless blended multichannel display for viewing by the user. As shown by step 1506, this process continues as the graphics application continues to function.
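The ordering of steps 1501 through 1506 can be summarized as a per-frame loop (a minimal sketch under the same assumptions as the earlier fragments; all function names here are hypothetical stand-ins):

/* Sketch of the per-frame flow of process 1500: the scene is rendered
 * first, the blend geometry second, and only then is the finished,
 * pre-blended frame sent on to video. */
extern int application_running(void);                  /* step 1506 test   */
extern void draw_scene(void);                          /* step 1502        */
extern void render_blend_polygons(float x0, float x1,
                                  float y0, float y1, int steps);
extern void swap_buffers(void);   /* platform-specific; steps 1504-1505 */

void frame_loop(void)
{
    while (application_running()) {
        draw_scene();                                     /* render image */
        render_blend_polygons(0.8f, 1.0f, 0.0f, 1.0f, 8); /* step 1503    */
        swap_buffers();                                   /* to video     */
    }
}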
With reference now to FIG. 16, a flow chart of the steps of a process 1600 in accordance with an alternate embodiment of the present invention is shown. Process 1600 is substantially similar to process 1500 of FIG. 15. However, in process 1600, as opposed to using polygons to implement the overlap regions 310-311, an alpha mask (e.g., alpha mask 1301 from FIG. 13) is used.
Process 1600 begins in step 1601, where a graphics application is executed. In step 1602, the graphics application sends image information to the graphics subsystem 411 and, as described in the discussion of FIG. 13, an alpha mask is rendered into the frame buffer 435 of the graphics subsystem 411. In step 1603, an image from the graphics application is then rendered into the frame buffer of the graphics subsystem, “on top of” the alpha mask. As described above, this process results in the image being blended within the overlap regions 310 and 311 as required. In step 1604, the resulting blended image is then converted to video format and sent to the image projector 331. In step 1605, the blended image is projected onto screen 302 by the image projector 331. As shown by step 1606, process 1600 continues as the graphics application continues to run, thereby implementing the seamless multichannel image for viewing by the user.
Thus, the present invention provides a method for implementing edge blending between multiple video channels to create a seamless multichannel display system. The present invention provides a method and system for implementing high fidelity blending for multichannel displays without requiring the use of dedicated blending hardware for processing video information after it emerges from image generation. The present invention provides a system which can be efficiently implemented on multiple computer system platforms. Additionally, the system of the present invention is inexpensive, and does not require additional, dedicated hardware for its implementation.
The foregoing descriptions of specific embodiments of the present invention have been presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed, and obviously many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the Claims appended hereto and their equivalents.

Claims (32)

What is claimed is:
1. In a multichannel display system, a method for implementing edge blending between a first and second video frame to create a seamless multichannel display system, comprising:
a) rendering a first video frame for display on a first channel;
b) rendering a second video frame for display on a second channel;
c) rendering a first overlap region onto the first video frame to obtain a first blended video frame;
d) rendering a second overlap region onto the second video frame to obtain a second blended video frame, wherein the first overlap region and the second overlap region are implemented by producing a first polygon to define the first overlap region and a second polygon to define the second overlap region, and mapping a first texture over the first polygon and a second texture over the second polygon such that the first and second textures modulate a brightness level within the first and second overlap regions; and
e) combining the first blended video frame from the first channel and the second blended video frame from the second channel such that the first overlap region and the second overlap region correspond, thereby forming a seamless junction between the first blended video frame and the second blended video frame.
2. The method of claim 1, wherein the first video frame is rendered using a first computer system coupled to provide video information to the first channel, and the second video frame is rendered using a second computer system coupled to provide video information to the second channel.
3. The method of claim 1, wherein the first overlap region modulates a brightness level of the first video frame and the second overlap region modulates a brightness level of the second video frame such that the first blended video frame and the second blended video frame combine to form the seamless junction there between.
4. The method of claim 1, wherein the first polygon is a first plurality of polygons and the second polygon is a second plurality of polygons.
5. The method of claim 4, wherein the plurality of polygons have respective transparency values operable for modulating the brightness level within the first and second overlap regions.
6. The method of claim 4, wherein the plurality of polygons have respective color values operable for modulating the brightness level within the first and second overlap regions.
7. The method of claim 1, wherein the first and second textures include a plurality of colored samples operable for modulating the brightness level within the first and second overlap regions.
8. The method of claim 1, wherein a first transparency mask is stored within a first frame buffer and a second transparency mask is stored within a second frame buffer, the first and second transparency masks operable to modulate the brightness level within the first and second overlap regions as the first and second video frames are respectively rendered into the first and second frame buffers.
9. In a multichannel display system including at least a first computer system for generating video information for a first channel and a second computer system for generating video information for a second channel, a method for implementing edge blending between the first and second channels to create a seamless multichannel display, the method comprising the computer implemented steps of:
a) using the first computer system, rendering a first video frame for display on the first channel;
b) using the second computer system, rendering a second video frame for display on the second channel;
c) rendering a first overlap region onto the first video frame to obtain a first blended video frame within a frame buffer of the first computer system;
d) rendering a second overlap region onto the second video frame to obtain a second blended video frame within a frame buffer of the second computer system, wherein the first overlap region and the second overlap region are implemented by producing a first polygon to define the first overlap region and a second polygon to define the second overlap region, and mapping a first texture having a first color sample over the first polygon and a second texture having a second color sample over the second polygon such that the first and second textures modulate a brightness level within the first and second overlap regions;
e) projecting the first blended video frame from the first channel onto a first screen; and
f) projecting the second blended video frame from the second channel onto a second screen, the first and second screens being directly adjacent, such that the first overlap region and the second overlap region correspond to form a seamless junction between the first blended video frame and the second blended video frame whereby the use of dedicated blending hardware external to one or more of the first computer system and the second computer system is not required.
10. The method of claim 9, wherein the first overlap region modulates a brightness level of the first video frame and the second overlap region modulates a brightness level of the second video frame such that the first blended video frame and the second blended video frame combine to form the seamless junction there between.
11. The method of claim 9, wherein the first polygon is a first plurality of polygons and the second polygon is a second plurality of polygons.
12. The method of claim 11, wherein the plurality of polygons have respective transparency values operable for modulating the brightness level within the first and second overlap regions.
13. The method of claim 11, wherein the plurality of polygons have respective color values operable for modulating the brightness level within the first and second overlap regions.
14. The method of claim 9, wherein the first colored sample is a first plurality of colored samples and the second colored sample is a second plurality of colored samples.
15. The method of claim 9, wherein a first transparency mask is stored within the frame buffer of the first computer system and a second transparency mask is stored within the frame buffer of the second computer system, the first and second transparency masks operable to modulate the brightness level within the first and second overlap regions as the first and second video frames are respectively rendered into the first and second frame buffers.
16. A multichannel display system, comprising:
a first computer system for generating video information coupled to a first channel; and
a second computer system for generating video information coupled to a second channel, the first and second computer systems including respective processors coupled to respective memories, wherein the first and second computer systems execute software causing the first and second computer systems to implement a method for edge blending between the first and second channels to create a seamless multichannel display, the method comprising the computer implemented steps of:
a) using the first computer system, rendering a first frame for display on the first channel;
b) using the second computer system, rendering a second frame for display on the second channel;
c) rendering a first overlap region onto the first frame to obtain a first blended frame within a frame buffer of the first computer system;
d) rendering a second overlap region onto the second frame to obtain a second blended frame within a frame buffer of the second computer system, wherein the first overlap region and the second overlap region are implemented by producing a first polygon to define the first overlap region and a second polygon to define the second overlap region, and mapping a first texture having a first color sample over the first polygon and a second texture having a second color sample over the second polygon such that the first and second textures modulate a brightness level within the first and second overlap regions;
e) outputting the first blended frame from the first channel to a first projector for projection onto a first screen; and
f) outputting the second blended frame from the second channel to a second projector for projection onto a second screen, the first and second screens being directly adjacent, such that the first overlap region and the second overlap region correspond to form a seamless junction between the first blended video frame and the second blended video frame whereby the use of dedicated blending hardware external to one or more of the first computer system and the second computer system is not required.
17. The system of claim 16, wherein the first overlap region modulates a brightness level of the first frame and the second overlap region modulates a brightness level of the second frame such that the first blended video frame and the second blended video frame combine to form the seamless junction there between.
18. The system of claim 16, wherein the first polygon is a first plurality of polygons and the second polygon is a second plurality of polygons.
19. The system of claim 18, wherein the plurality of polygons have respective transparency values operable for modulating the brightness level within the first and second overlap regions.
20. The system of claim 18, wherein the plurality of polygons have respective color values operable for modulating the brightness level within the first and second overlap regions.
21. The system of claim 16, wherein the first colored sample is a first plurality of colored samples and the second colored sample is a second plurality of colored samples.
22. The system of claim 16, wherein a first transparency mask is stored within the frame buffer of the first computer system and a second transparency mask is stored within the frame buffer of the second computer system, the first and second transparency masks operable to modulate the brightness level within the first and second overlap regions as the first and second frames are respectively rendered into the first and second frame buffers.
23. The system of claim 22, wherein the first and second transparency masks are configured to implement irregularly shaped first and second overlap regions, thereby rendering the junction between the first and second blended frames less noticeable.
24. The system of claim 16, wherein the first overlap region and the second overlap region are implemented by storing the first frame into a texture memory of the first computer system and the second frame into a texture memory of the second computer system, the first frame and the second frame subsequently texture mapped onto a first polygon mesh, including the first polygon, in the frame buffer of the first computer system and a second polygon mesh, including the second polygon, in the frame buffer of the second computer system.
25. The system of claim 24, wherein the first polygon mesh and the second polygon mesh include a plurality of polygons having a respective color such that the first overlap region and the second overlap region are implemented when the first frame and the second frame are texture mapped onto the first polygon mesh and the second polygon mesh.
26. The system of claim 24, wherein the first polygon mesh and the second polygon mesh are subdivided within the first overlap region and the second overlap region to implement the edge blending.
27. In a multichannel display system, a method for implementing edge blending between a first and second video frame to create a seamless multichannel display system, comprising:
storing a first transparency mask, having a first color sample, within a first frame buffer;
rendering a first video frame into the first frame buffer for display on a first channel, the first transparency mask operable to modulate the brightness level to implement a first overlap region;
storing a second transparency mask, having a second color sample, within a second frame buffer;
rendering a second video frame into the second frame buffer for display on a second channel, the second transparency mask operable to modulate the brightness level to implement a second overlap region; and
projecting the first channel and the second channel such that the first overlap region and the second overlap region correspond, thereby forming a seamless junction between the first blended video frame and the second blended video frame.
28. The method of claim 27 further comprising:
implementing the first transparency mask using a first texture map; and
implementing the second transparency mask using a second texture map.
29. The method of claim 28, wherein the first and second transparency masks are configured to implement irregularly shaped first and second overlap regions, thereby reducing the visibility of the junction between the first and second channels.
30. The method of claim 29 further comprising:
rendering the first video frame into the first frame buffer with the first transparency mask using a texture mapping unit of a graphics subsystem; and
rendering the second video frame into the second frame buffer with the second transparency mask using the texture mapping unit of the graphics subsystem.
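One way to realize the transparency-mask method of claims 27 through 30 on hardware with a destination alpha channel is sketched below, again in C with OpenGL 1.x: the mask texture is first written into the frame buffer's alpha plane, and the frame is then rendered with destination-alpha blending so that every fragment is scaled by the stored mask. This is a hedged illustration, not the claimed implementation; mask_tex (with the ramp in its alpha channel) and render_scene are hypothetical inputs, matrices are assumed identity, and depth testing is ignored for brevity.

/* Sketch: frame buffer transparency mask modulating the frame as drawn. */
#include <GL/gl.h>

void draw_masked_frame(GLuint mask_tex, void (*render_scene)(void))
{
    /* 1. Write the mask into the alpha plane only; RGB is untouched. */
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_TRUE);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, mask_tex);
    glBegin(GL_QUADS);                  /* full-screen quad in NDC */
    glTexCoord2f(0, 0); glVertex2f(-1, -1);
    glTexCoord2f(1, 0); glVertex2f( 1, -1);
    glTexCoord2f(1, 1); glVertex2f( 1,  1);
    glTexCoord2f(0, 1); glVertex2f(-1,  1);
    glEnd();
    glDisable(GL_TEXTURE_2D);
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);

    /* 2. Render the frame scaled by the stored mask: with this blend
     * function each incoming fragment is multiplied by the alpha value
     * already sitting in the frame buffer at that pixel. */
    glEnable(GL_BLEND);
    glBlendFunc(GL_DST_ALPHA, GL_ZERO);
    render_scene();
    glDisable(GL_BLEND);
}

Because the mask is any texture, the irregular overlap shapes of claim 29 fall out for free: an arbitrary alpha image serves as the mask.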
31. In a multichannel display system, a method for implementing edge blending between a first and second video frame to create a seamless multichannel display system, comprising:
a) rendering a first video frame for display on a first channel;
b) rendering a second video frame for display on a second channel;
c) rendering a first overlap region onto the first video frame to obtain a first blended video frame;
d) rendering a second overlap region onto the second video frame to obtain a second blended video frame, wherein the first overlap region and the second overlap region are implemented by producing a first polygon to define the first overlap region and a second polygon to define the second overlap region, and mapping a first texture having a first color sample over the first polygon and a second texture having a second color sample over the second polygon such that the first and second textures modulate a brightness level within the first and second overlap regions; and
e) combining the first blended video frame from the first channel and the second blended video frame from the second channel such that the first overlap region and the second overlap region correspond, thereby forming a seamless junction between the first blended video frame and the second blended video frame.
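Claim 31 applies the ramp after the frame is rendered: a single textured polygon is drawn over the overlap region with multiplicative blending, darkening the pixels already in the frame buffer. A minimal C/OpenGL 1.x sketch follows; ramp_tex (a luminance ramp falling from 1 to 0 across the overlap) and the normalized overlap extent [x0, x1] are illustrative assumptions, with identity matrices.

/* Sketch: post-render blend polygon over the overlap region. */
#include <GL/gl.h>

void darken_overlap(GLuint ramp_tex, float x0, float x1)
{
    glDisable(GL_DEPTH_TEST);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, ramp_tex);
    glColor3f(1.0f, 1.0f, 1.0f);        /* texel passes through unchanged */
    glEnable(GL_BLEND);
    glBlendFunc(GL_ZERO, GL_SRC_COLOR); /* dst = dst * texture color */

    glBegin(GL_QUADS);                  /* overlap strip in NDC */
    glTexCoord2f(0, 0); glVertex2f(x0, -1);
    glTexCoord2f(1, 0); glVertex2f(x1, -1);
    glTexCoord2f(1, 1); glVertex2f(x1,  1);
    glTexCoord2f(0, 1); glVertex2f(x0,  1);
    glEnd();

    glDisable(GL_BLEND);
    glDisable(GL_TEXTURE_2D);
}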
32. A multichannel display system, comprising:
a first computer system capable of rendering a first frame and a first overlap region onto the first frame to obtain a first blended frame within a first frame buffer for generating video information coupled to a first channel;
a second computer system capable of rendering a second frame and a second overlap region onto the second frame to obtain a second blended frame within a second frame buffer for generating video information coupled to a second channel;
a first projector capable of projecting the first blended frame onto a first screen; and
a second projector capable of projecting the second blended frame onto a second screen;
wherein:
the first and second screens are directly adjacent, such that the first overlap region and the second overlap region correspond to form a seamless junction between the first blended frame and the second blended frame, whereby dedicated blending hardware external to the first computer system or the second computer system is not required; and
the first overlap region and the second overlap region are implemented by producing a first polygon to define the first overlap region and a second polygon to define the second overlap region, and mapping a first texture having a first color sample over the first polygon and a second texture having a second color sample over the second polygon such that the first and second textures modulate a brightness level within the first and second overlap regions.
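Common to all of the foregoing claims is the constraint that the two brightness ramps be complementary, so the summed light output stays constant across the junction. As a short idealized derivation (assuming a uniform power-law projector response with exponent \( \gamma \); the linear ramp is one example choice):

\[ w_1(x) + w_2(x) = 1, \qquad w_1(0) = 1, \quad w_1(1) = 0, \qquad x \in [0,1], \]

for example \( w_1(x) = 1 - x \) and \( w_2(x) = x \). Since a projector maps a stored value \( v \) to luminance \( L \propto v^{\gamma} \), the value written into the polygon colors, ramp texture, or transparency mask should be pre-corrected as

\[ v_i(x) = w_i(x)^{1/\gamma}, \]

so that \( L_1(x) + L_2(x) \propto v_1^{\gamma} + v_2^{\gamma} = w_1(x) + w_2(x) = 1 \) throughout the overlap region.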
US09/231,405 | 1999-01-14 | 1999-01-14 | Method and system for efficient edge blending in high fidelity multichannel computer graphics displays | Expired - Lifetime | US6545685B1 (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
US09/231,405 (US6545685B1) | 1999-01-14 | 1999-01-14 | Method and system for efficient edge blending in high fidelity multichannel computer graphics displays

Publications (1)

Publication Number | Publication Date
US6545685B1 (en) | 2003-04-08

Family

ID=22869112

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
US09/231,405 (US6545685B1, Expired - Lifetime) | Method and system for efficient edge blending in high fidelity multichannel computer graphics displays | 1999-01-14 | 1999-01-14

Country Status (1)

Country | Link
US (1) | US6545685B1 (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US4974073A (en) * | 1988-01-14 | 1990-11-27 | Metavision Inc. | Seamless video display
US5136390A (en) * | 1990-11-05 | 1992-08-04 | Metavision Corporation | Adjustable multiple image display smoothing method and apparatus
US5896136A (en) * | 1996-10-30 | 1999-04-20 | Hewlett Packard Company | Computer graphics system with improved blending
US6184903B1 (en) * | 1996-12-27 | 2001-02-06 | Sony Corporation | Apparatus and method for parallel rendering of image pixels
US6126548A (en) * | 1997-10-08 | 2000-10-03 | Illusion, Inc. | Multi-player entertainment system



Legal Events

Date | Code | Title | Description

AS | Assignment

Owner name: SILICON GRAPHICS, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DORBIE, ANGUS;REEL/FRAME:009863/0222

Effective date: 19990319

STCF | Information on status: patent grant

Free format text: PATENTED CASE

AS | Assignment

Owner name: WELLS FARGO FOOTHILL CAPITAL, INC., CALIFORNIA

Free format text: SECURITY AGREEMENT;ASSIGNOR:SILICON GRAPHICS, INC. AND SILICON GRAPHICS FEDERAL, INC. (EACH A DELAWARE CORPORATION);REEL/FRAME:016871/0809

Effective date: 20050412

FPAY | Fee payment

Year of fee payment: 4

AS | Assignment

Owner name: GENERAL ELECTRIC CAPITAL CORPORATION, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:SILICON GRAPHICS, INC.;REEL/FRAME:018545/0777

Effective date: 20061017

AS | Assignment

Owner name: MORGAN STANLEY & CO., INCORPORATED, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GENERAL ELECTRIC CAPITAL CORPORATION;REEL/FRAME:019995/0895

Effective date: 20070926

FPAY | Fee payment

Year of fee payment: 8

AS | Assignment

Owner name: GRAPHICS PROPERTIES HOLDINGS, INC., NEW YORK

Free format text: CHANGE OF NAME;ASSIGNOR:SILICON GRAPHICS, INC.;REEL/FRAME:028066/0415

Effective date: 20090604

AS | Assignment

Owner name: RPX CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GRAPHICS PROPERTIES HOLDINGS, INC.;REEL/FRAME:029564/0799

Effective date: 20121224

FPAY | Fee payment

Year of fee payment: 12

AS | Assignment

Owner name: JEFFERIES FINANCE LLC, NEW YORK

Free format text: SECURITY INTEREST;ASSIGNOR:RPX CORPORATION;REEL/FRAME:046486/0433

Effective date: 20180619

AS | Assignment

Owner name: RPX CORPORATION, CALIFORNIA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JEFFERIES FINANCE LLC;REEL/FRAME:054486/0422

Effective date: 20201023

