US9489043B2 - Display device and controlling method - Google Patents

Display device and controlling method

Info

Publication number
US9489043B2
Authority
US
United States
Prior art keywords
user
section
control information
system control
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US14/589,107
Other versions
US20150185830A1 (en)
Inventor
Yuuichi Iida
Shinichi Hayashi
Yusuke Sakai
Junji OI
Daijiro Sakuma
Shingo Tsurumi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Saturn Licensing LLC
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp
Priority to US14/589,107
Publication of US20150185830A1
Application granted
Publication of US9489043B2
Assigned to Saturn Licensing LLC (assignment of assignors interest; assignor: Sony Corporation)
Status: Active
Adjusted expiration


Abstract

An image display device and controlling method capable of optimizing a state of the image display device for a user at a desired position. The display device includes: an imaging section that takes an image of a predetermined range of a dynamic image with respect to an image display direction; an image analyzing section that analyzes the dynamic image taken by the imaging section and calculates a position of a user; a system optimization processing section that calculates system control information for optimizing a system based on the position of the user calculated by the image analyzing section; and a system controlling section that optimizes the system based on the system control information calculated by the system optimization processing section.

Description

CROSS REFERENCE TO RELATED APPLICATIONS
This application is a continuation of U.S. application Ser. No. 13/395,035, filed Jun. 21, 2012, which is a National Stage of PCT/JP2010/062311, filed Jul. 22, 2010, which claims the benefit of priority to Japanese Patent Application No. 2009-213377, filed Sep. 15, 2009, the contents of which are incorporated herein in their entirety.
TECHNICAL FIELD
The present invention relates to a display device and a controlling method.
BACKGROUND ART
In recent years, accompanying the expansion of the flat-screen television market, demand for image display devices such as large-screen televisions for installation in, for example, a living room is increasing. In this situation, image display devices having various functions are being proposed.
SUMMARY OF INVENTION
Technical Problem
Incidentally, since a user can watch an image that an image display device is displaying from any desired position, the state of the image display device may, depending on the viewing position of the user, not be optimal for the user at that position: for example, an audio property such as the volume balance of sound output from an audio output section, an image property and the display contents of an image display section of the image display device, the displaying direction of the image display device, and the like.
Thus, the present invention has been made in view of the above problem, and an aim of the present invention is to provide a novel and improved image display device and controlling method capable of optimizing the state of the image display device for the user at the desired position.
Solution to Problem
To solve the above problem, according to an aspect of the present invention, a display device including an imaging section that takes an image of a predetermined range of a dynamic image with respect to an image display direction; an image analyzing section that analyzes the dynamic image taken by the imaging section and calculates a position of a user; a system optimization processing section that calculates system control information for optimizing a system based on the position of the user calculated by the image analyzing section; and a system controlling section that optimizes the system based on the system control information calculated by the system optimization processing section is provided.
The system optimization processing section may calculate system control information for optimizing a volume balance of sound output from an audio output section based on the position of the user calculated by the image analyzing section.
The system optimization processing section may calculate system control information for optimizing an image property of an image display section based on the position of the user calculated by the image analyzing section.
The system optimization processing section may calculate system control information for optimizing display contents of an image display section based on the position of the user calculated by the image analyzing section.
The system optimization processing section may calculate system control information for optimizing a device direction of the display device itself based on the position of the user calculated by the image analyzing section.
The image analyzing section may analyze the dynamic image taken by the imaging section and calculate a three-dimensional position of the user.
The image analyzing section may analyze the dynamic image taken by the imaging section and calculate respective positions of a plurality of users, and the system optimization processing section may calculate positions of center of balance of the plurality of users based on the positions of the plurality of users calculated by the image analyzing section, and calculate system control information for optimizing a system based on the calculated positions of center of balance of the plurality of users.
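The patent does not give an explicit formula for the positions of center of balance of a plurality of users; a minimal sketch, assuming the center of balance is the arithmetic mean (centroid) of the users' calculated positions, could look like this:

```python
# Hypothetical sketch: assumes the "center of balance" of several users is
# the arithmetic mean (centroid) of their calculated [x, y, z] positions.
# The patent does not specify the exact formula.

def center_of_balance(positions):
    """Return the centroid of a list of [x, y, z] user positions."""
    n = len(positions)
    return [sum(p[i] for p in positions) / n for i in range(3)]

users = [[-0.5, 0.0, 2.0], [0.5, 0.2, 3.0]]
print(center_of_balance(users))  # [0.0, 0.1, 2.5]
```

A weighted mean (for example, by viewing duration) would fit the same interface if the system favors some users over others.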
Further, to solve the above problem, according to another aspect of the present invention, a controlling method including an imaging step of taking an image of a predetermined range of a dynamic image with respect to an image display direction; an image analyzing step of analyzing the taken dynamic image and calculating a position of a user; a system optimization processing step of calculating system control information for optimizing a system based on the calculated position of the user; and a system controlling step of optimizing the system based on the calculated system control information is provided.
Advantageous Effects of Invention
As explained above, according to the present invention, a novel and improved image display device and controlling method capable of optimizing the state of the image display device for the user at the desired position can be provided.
BRIEF DESCRIPTION OF DRAWINGS
FIG. 1 is an explanatory diagram that explains an outer appearance of an image display device 100 in an embodiment of the present invention.
FIG. 2 is an explanatory diagram that explains a configuration of the image display device 100 of the embodiment of the present invention.
FIG. 3 is an explanatory diagram that explains a configuration of a control section 110.
FIG. 4 is an explanatory diagram that explains the configuration of the control section 110.
FIG. 5(A) is an explanatory diagram for explaining a case where a user 1 and a user 2 are present in an imaging area of an imaging section 104, and FIG. 5(B) is an explanatory diagram for explaining a face detecting position [a1, b1] and a face size [w1, h1] of the user 1, and a face detecting position [a2, b2] and a face size [w2, h2] of the user 2 that are included in an image taken by the imaging section 104.
FIG. 6(A) is an explanatory diagram for explaining a case where users are present at a reference distance d0 and a distance d1 in the imaging area of the imaging section 104, FIG. 6(B) is an explanatory diagram for explaining the face size [w1, h1] of the user at the distance d1 in the image taken by the imaging section 104, and FIG. 6(C) is an explanatory diagram for explaining a reference face size [w0, h0] of the user at the reference distance d0 in the image taken by the imaging section 104.
FIG. 7(A) is an explanatory diagram for explaining a device center [0,0,0] of the image display device 100 and a camera position [Δx, Δy, Δz] of the imaging section 104, and FIG. 7(B) is an explanatory diagram for explaining the device center [0,0,0] of the image display device 100, a frontward axis [0,0], the camera position [Δx, Δy, Δz] of the imaging section 104, and an installation angle [Δφ, Δθ].
FIG. 8 is a flow diagram showing an example of an optimization process in accordance with a position of the user by the image display device 100 of the embodiment of the present invention.
FIG. 9 is a flow diagram showing an example of an optimization process in accordance with positions of one or more users by the image display device 100 of the embodiment of the present invention.
FIG. 10 is a flow diagram showing an example of an optimization process in accordance with ages of one or more users by the image display device 100 of the embodiment of the present invention.
FIG. 11 is an explanatory diagram for explaining an optimization process of a volume balance.
FIG. 12 is an explanatory diagram for explaining a calculation method of positions of center of balance of a user group.
FIG. 13(A) is an explanatory diagram for explaining a position [D0, Ah0] of a user A and a position [D1, Ah1] of a user B in a horizontal direction, and FIG. 13(B) is an explanatory diagram for explaining a position [D0, Av0] of the user A and a position [D1, Av1] of the user B in a vertical direction.
FIG. 14(A) and FIG. 14(B) are explanatory diagrams for explaining an optimization process of a letter size of a GUI.
FIG. 15(A) to FIG. 15(C) are explanatory diagrams for explaining a correction method of the reference face size [w0, h0] at the reference distance d0 in a calculation of the user distance.
DESCRIPTION OF EMBODIMENTS
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the appended drawings. Note that, in this specification and the drawings, elements that have substantially the same function and structure are denoted with the same reference signs, and repeated explanation is omitted.
Note that, explanation will be given in the following order.
<1. One Embodiment of the Present Invention>
[1-1. Configuration of Image Display Device]
[1-2. Configuration of Control Section]
[1-3. Optimization Process according to User Position]
[1-4. Optimization Process according to User Positions of One or More Users]
[1-5. Optimization Process according to Ages of One or More Users]
1. One Embodiment of the Present Invention
[1-1. Configuration of Image Display Device]
Firstly, a configuration of an image display device of an embodiment of the present invention will be explained. FIG. 1 is an explanatory diagram that explains an outer appearance of an image display device 100 in the embodiment of the present invention. FIG. 1 is a front view diagram of the image display device 100 viewed from a front side. Hereinbelow, the outer appearance of the image display device 100 of the embodiment of the present invention will be explained with reference to FIG. 1.
As shown in FIG. 1, the image display device 100 of the embodiment of the present invention includes imaging sections 104 that take a dynamic image, arranged at an upper center position and left and right center positions of a display panel section 102 that displays a still image or a dynamic image. The imaging sections 104 take the dynamic image with respect to the direction in which the image display device 100 displays the still image or the dynamic image in the display panel section 102. The image display device 100 of the embodiment analyzes an image taken by the imaging sections 104, and detects a face of a user taken in the image. The image display device 100 analyzes the detected face of the user, and detects a face detecting position and a face size. The image display device 100 calculates a relative position of the user with respect to an optical axis of a camera of the respective imaging sections 104 based on the detected face detecting position and face size. Then, the image display device 100 calculates a position of the user with respect to a device center and a frontward axis of the image display device 100 based on the calculation result of the relative position of the user with respect to the optical axis of the camera of the respective imaging sections 104 and appended information such as a position and an angle of the camera of the respective imaging sections 104. The image display device 100 of the embodiment characteristically optimizes a state of the image display device 100 in accordance with the position of the user; that is, an audio property such as sound volume, an image property, display contents, and a device direction with respect to the position of the user.
Further, the image display device 100 of the embodiment includes a sensor section 106 arranged at a lower center position of the display panel section 102. The sensor section 106 detects the presence/absence of a person in front of the image display device 100.
Note that, in FIG. 1, although the image display device 100 has the imaging sections 104 that take the dynamic image at three portions around the display panel section 102, it is needless to say that the positions of the imaging sections 104 that take the dynamic image are not limited to the above example in the present invention; for example, a device independent from the image display device 100 may be provided, the device may be connected to the image display device 100, and the dynamic image may be taken by this device. Further, it is needless to say that the number of imaging sections 104 is not limited to three; images may be taken by providing two or fewer, or four or more imaging sections 104. Moreover, it is needless to say that the number of sensor sections 106 is not limited to one; two or more sensor sections may be provided.
Further, although not shown in FIG. 1, the image display device 100 may further include a signal receiving section that can receive a control signal from a remote controller (not shown) through an infrared scheme or a wireless scheme.
As above, the outer appearance of the image display device 100 of the embodiment of the present invention has been explained with reference to FIG. 1. Next, a configuration of the image display device 100 of the embodiment of the present invention will be explained.
FIG. 2 is an explanatory diagram that explains the configuration of the image display device 100 of the embodiment of the present invention. Hereinbelow, the configuration of the image display device 100 of the embodiment of the present invention will be explained with reference to FIG. 2.
As shown in FIG. 2, the image display device 100 of the embodiment of the present invention is configured by including the display panel section 102, the imaging sections 104, the sensor section 106, a speaker section 108, a mechanism section 109, and a control section 110.
Further, the control section 110 is configured by including an image input section 112, an image processing section 114, a viewing state analyzing section 116, a user position information storing section 117, a viewing state recording section 118, a system optimization processing section 120, and a system control section 122.
The display panel section 102 displays the still image or the dynamic image based on a panel drive signal. In the embodiment, the display panel section 102 displays the still image or the dynamic image using liquid crystals. Of course, it is needless to say that in the invention the display panel section 102 is not limited to the above example. The display panel section 102 may display the still image or the dynamic image using a self-emitting display device such as an organic EL (electroluminescence) display.
As described above, the imaging sections 104 are arranged at the upper center position and the left and right center positions of the display panel section 102 that displays the still image or the dynamic image. The imaging sections 104 take the dynamic image with respect to the direction in which the image display device 100 displays the dynamic image in the display panel section 102 when the panel drive signal is supplied to the display panel section 102 and the display panel section 102 is displaying the dynamic image. The imaging sections 104 may take the dynamic image with CCD (Charge Coupled Device) image sensors, or with CMOS (Complementary Metal Oxide Semiconductor) image sensors. The dynamic image taken by the imaging sections 104 is sent to the control section 110.
The sensor section 106 is provided at the lower center position of the display panel section 102 that displays the still image or the dynamic image, and is, for example, for detecting the presence/absence of a person in front of the image display device 100. Further, if a person is present in front of the image display device 100, the sensor section 106 can detect the distance between the image display device 100 and that person. The detection result and distance information from the sensor section 106 are sent to the control section 110. The speaker section 108 outputs sounds based on a sound outputting signal. The mechanism section 109 controls a displaying direction of the display panel section 102 of the image display device 100, for example based on a drive signal.
The control section 110 controls operations of the image display device 100. Hereinbelow, respective sections of the control section 110 will be explained.
The image input section 112 receives the dynamic image taken by the imaging sections 104. The dynamic image received by the image input section 112 is sent to the image processing section 114, and is used in the image processes in the image processing section 114.
The image processing section 114 is an example of an image analyzing section of the present invention, and performs the respective image processes on the dynamic image that is taken by the imaging sections 104 and sent from the image input section 112. The image processes performed by the image processing section 114 include a detection process for an object included in the dynamic image taken by the imaging sections 104, a detection process for the number of persons included in the dynamic image, and a detection process for a face and a facial expression included in the dynamic image. Results of the respective image processes by the image processing section 114 are sent to the viewing state analyzing section 116, and are used for analyses of the presence/absence of a person watching the image display device 100, and the viewing state and the viewing position of the person.
As for the face detection process to detect the face included in the image by the image processing section 114, techniques described, for example, in Japanese Patent Application Publication No. 2007-65766 and Japanese Patent Application Publication No. 2005-44330 can be used. Hereinbelow, the face detection process will briefly be explained.
In order to detect the face of the user from the image, firstly, a face position, a face size, and a face direction in the supplied image are respectively detected. By detecting the position and size of the face, a face image portion can be cut out from the image. Then, based on the cut-out face image and information of the face direction, feature portions of the face (facial feature positions), for example, eyebrows, eyes, nose, mouth, and the like, are detected. In the detection of the facial feature positions, for example, a method called AAM (Active Appearance Models) may be adopted so as to be capable of detecting the feature positions.
When the facial feature positions are detected, a local feature value is calculated for each of the detected facial feature positions. By calculating the local feature values and storing the calculated local feature values together with the face image, face recognition becomes possible from the image taken by the imaging section 104. As for the method of face recognition, techniques described, for example, in Japanese Patent Application Publication No. 2007-65766 and Japanese Patent Application Publication No. 2005-44330 can be used, and thus a detailed description thereof will herein be omitted. Further, it is also possible to determine whether the face taken in the supplied image is of a male or a female, and how old the person is, from the face image and the facial feature positions. Moreover, by recording information of faces in advance, a person taken in the supplied image can be searched for among the recorded faces, and an individual can be specified.
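The cut-out step described above can be sketched as follows. The nested-list image and the `crop_face` helper are illustrative stand-ins; a real implementation would use a detector library and array slicing:

```python
# Minimal sketch of the face cut-out step: given a detected face position
# [a, b] (taken here as the top-left corner, in pixels) and size [w, h],
# extract that region from an image represented as a nested list of rows.

def crop_face(image, position, size):
    a, b = position   # x (column), y (row) of the top-left corner
    w, h = size       # width and height of the detected face
    return [row[a:a + w] for row in image[b:b + h]]

# 8x6 dummy image whose "pixels" record their own coordinates.
image = [[(x, y) for x in range(8)] for y in range(6)]
face = crop_face(image, position=(2, 1), size=(3, 2))
print(len(face), len(face[0]))  # 2 3
```

The cropped region would then be passed to the feature-position and local-feature-value stages described above.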
The viewingstate analyzing section116 is an example of an image analyzing section of the present invention. It receives a result of the respective image processes by theimage processing section114 and a detection result and distance information by the detection of thesensor section106, and performs an analysis of the viewing state and the viewing position of the person watching the image displayed by theimage display device100 by using the result of the respective image processes by theimage processing section114 and the detection result and the distance information by the detection of thesensor section106. By the viewingstate analyzing section116 analyzing the viewing state and the viewing position of the person watching, theimage display device100 can control the audio property such as a sound balance of an output from thespeaker section108, control the image property of thedisplay panel section102, control the display contents of thedisplay panel section102, and control the display direction of thedisplay panel section102 by using themechanism section109 in accordance with the viewing position of the user watching theimage display device100. An analysis result of an analysis process by the viewingstate analyzing section116 is sent to the viewingstate recording section118, the user positioninformation storing section117 and the systemoptimization process section120.
Note that, in a case where although a dynamic body is detected from the detection result and the distance information by the detection of thesensor section106 but a distance between thesensor section106 and that dynamic body is at or exceeding a predetermined distance, the viewingstate analyzing section116 can exclude that object from being a detection target.
The viewingstate recording section118 records the analysis result obtained by the analysis process by the viewingstate analyzing section116. The analysis result in the viewingstate analyzing section116 recorded by the viewingstate recording section118 is used in the system optimization process by the systemoptimization processing section120. Further, the analysis result in the viewingstate analyzing section116 recorded by the viewingstate recording section118 may be sent to an externaldata collection server200.
The user positioninformation storing section117 stores the analysis result of an analysis process by the viewingstate analyzing section116.
The systemoptimization processing section120 calculates the system control information for performing the system optimization process on the respective sections of theimage display device100 by using the analysis result obtained by the analysis process of the viewingstate analyzing section116. The system optimization process on the respective sections of theimage display device100 includes the control of the audio property such as the sound balance of the output from thespeaker section108, the control of the image property of thedisplay panel section102, the control of the display contents of thedisplay panel section102, and the control of the display direction of thedisplay panel section102 of theimage display device100 by themechanism section109.
Theimage display device100 can perform the optimization process in accordance with the position of the user based on the system control information calculated by the systemoptimization processing section120. The system control information calculated by the systemoptimization processing section120 is sent to thesystem control section122.
Thesystem control section122 performs the system optimization process on the respective sections of theimage display device100 based on the system control information calculated by the systemoptimization processing section120. For example, thesystem control section122 performs the control of the volume balance of the output from thespeaker section108, the control of the image property of thedisplay panel section102, the control of the display contents of thedisplay panel section102, the control of the display direction of thedisplay panel section102 of theimage display device100 by themechanism section109 and the like based on the system control information calculated by the systemoptimization processing section120.
As above, the configuration of theimage display device100 of the embodiment of the present invention has been explained with reference toFIG. 2. Next, a configuration of thecontrol section110 included in theimage display device100 of the embodiment of the present invention will be explained in more detail.
[1-2. Configuration of Control Section]
FIG. 3 is an explanatory diagram that explains the configuration of the control section 110 included in the image display device 100 of the embodiment of the present invention. Within the control section 110, FIG. 3 explains the configuration of the viewing state analyzing section 116 included in the control section 110. Hereinbelow, the configuration of the viewing state analyzing section 116 will be explained with reference to FIG. 3.
As shown in FIG. 3, the viewing state analyzing section 116 is configured by including a user direction/distance calculating section 132 and a user position information calculating section 134.
The user direction/distance calculating section 132 receives the results of the respective image processes by the image processing section 114 and optical information such as information of the angle of view and the resolution of each camera of the imaging sections 104, and calculates the relative position of the user (direction [φ1, θ1], distance d1) with respect to the optical axis of each camera of the imaging sections 104 by using these. Here, the taken image, as well as face detection information (for example, information such as the face detecting position [a1, b1], the face size [w1, h1], and other attribute information such as the age and sex) for each user using the image display device 100, are sent from the image processing section 114 to the user direction/distance calculating section 132 of the viewing state analyzing section 116. FIG. 5(A) is an explanatory diagram for explaining a case where a user 1 and a user 2 are present in the imaging area of the imaging section 104, and FIG. 5(B) is an explanatory diagram for explaining the face detecting position [a1, b1] and the face size [w1, h1] of the user 1, and the face detecting position [a2, b2] and the face size [w2, h2] of the user 2 that are included in the image taken by the imaging section 104. Further, FIG. 6(A) is an explanatory diagram for explaining a case where users are present at a reference distance d0 and a distance d1 in the imaging area of one of the imaging sections 104, FIG. 6(B) is an explanatory diagram for explaining the face size [w1, h1] of the user at the distance d1 in the image taken by the imaging section 104, and FIG. 6(C) is an explanatory diagram for explaining the reference face size [w0, h0] of the user at the reference distance d0 in the image taken by the imaging section 104.
The direction [φ1, θ1] is calculated as follows from the face detecting position [a1, b1] normalized by the taken image size [xmax, ymax] and the angle of view [φ0, θ0] of the camera of the imaging section 104.
Horizontal direction: φ1 = φ0 * a1
Vertical direction: θ1 = θ0 * b1
Further, the distance d1 is calculated as follows from the reference face size [w0, h0] at the reference distance d0.
Distance: d1 = d0 * (w0 / w1)
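The direction and distance formulas above can be sketched in code. The patent only says the face position is "normalized", so this sketch assumes [a1, b1] is normalized relative to the image center; the numeric values are illustrative:

```python
import math  # not needed here, kept for symmetry with the transform below

# Sketch of the direction/distance formulas: phi1 = phi0 * a1,
# theta1 = theta0 * b1, d1 = d0 * (w0 / w1). Assumes [a1, b1] has been
# normalized by the image size so that the image center is [0, 0].

def user_direction(a1, b1, view_angle):
    phi0, theta0 = view_angle          # camera angle of view [phi0, theta0]
    return phi0 * a1, theta0 * b1      # direction [phi1, theta1]

def user_distance(w1, ref_size=(0.1, 0.15), ref_distance=1.0):
    w0, _ = ref_size                   # reference face size [w0, h0] at d0
    return ref_distance * (w0 / w1)    # d1 = d0 * (w0 / w1)

phi1, theta1 = user_direction(0.25, -0.1, view_angle=(60.0, 40.0))
print(phi1, theta1)            # 15.0 -4.0
print(user_distance(w1=0.05))  # 2.0 (face half the reference width)
```

A face appearing half as wide as the reference face is taken to be twice as far away, which is the pinhole-camera approximation behind the distance formula.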
The user position information calculating section 134 receives the calculation result of the relative position of the user with respect to the optical axis of the camera of the respective imaging sections 104 by the user direction/distance calculating section 132, and the appended information such as the position and the angle of the camera of the respective imaging sections 104, and calculates a three-dimensional position of the user with respect to the device center of the image display device 100 and the frontward axis by using these. The user position information calculated by the user position information calculating section 134 is sent to the user position information storing section 117. FIG. 7(A) is an explanatory diagram for explaining the device center [0,0,0] of the image display device 100 and the camera position [Δx, Δy, Δz] of the imaging section 104, and FIG. 7(B) is an explanatory diagram for explaining the device center [0,0,0] of the image display device 100, the frontward axis [0,0], the camera position [Δx, Δy, Δz] of the imaging section 104, and the installation angle [Δφ, Δθ].
When the relative position of the user with respect to the optical axis of the camera of the imaging section 104 is of the direction [φ1, θ1] and distance d1, the device center of the image display device 100 is [0,0,0], the positional difference [Δx, Δy, Δz] with respect to the frontward axis [0,0] is the camera position of the imaging section 104, and the angular difference [Δφ, Δθ] is the installation angle, the position [x1, y1, z1] of the user with respect to the device center [0,0,0] of the image display device 100 is calculated as follows.
x1 = d1 * cos(θ1 − Δθ) * tan(φ1 − Δφ) − Δx
y1 = d1 * tan(θ1 − Δθ) − Δy
z1 = d1 * cos(θ1 − Δθ) * cos(φ1 − Δφ) − Δz
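A sketch of this coordinate transform, assuming the angles are given in degrees (the patent does not state the unit):

```python
import math

# Sketch of the transform above: from the relative direction [phi1, theta1]
# and distance d1 seen by one camera, compute the user position [x1, y1, z1]
# relative to the device center, given the camera position offset
# [dx, dy, dz] and installation angle [dphi, dtheta]. Angles in degrees.

def user_position(phi1, theta1, d1, cam_pos, cam_angle):
    dx, dy, dz = cam_pos
    dphi, dtheta = cam_angle
    phi = math.radians(phi1 - dphi)        # corrected horizontal angle
    theta = math.radians(theta1 - dtheta)  # corrected vertical angle
    x1 = d1 * math.cos(theta) * math.tan(phi) - dx
    y1 = d1 * math.tan(theta) - dy
    z1 = d1 * math.cos(theta) * math.cos(phi) - dz
    return x1, y1, z1

# A user straight ahead of a camera mounted exactly at the device center:
print(user_position(0.0, 0.0, 2.0, cam_pos=(0, 0, 0), cam_angle=(0, 0)))
# (0.0, 0.0, 2.0)
```

With zero offsets the transform reduces to the relative position itself, which is a quick sanity check on the sign conventions.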
FIG. 4 is an explanatory diagram that explains the configuration of thecontrol section110 included in theimage display device100 of the embodiment of the present invention. Within thecontrol section110,FIG. 4 explains configurations of the user positioninformation storing section117, the systemoptimization processing section120 and thesystem control section122 included in thecontrol section110. Hereinbelow, the configurations of the user positioninformation storing section117, the systemoptimization processing section120 and thesystem control section122 will be explained with reference toFIG. 3.
As shown inFIG. 4, the systemoptimization processing section120 is configured by including an audio propertyoptimization processing section142, an image propertyoptimization processing section144, and a device directionoptimization processing section146. Further, thesystem control section122 is configured by including an audioproperty control section152, an imageproperty control section154, and a devicedirection control section156.
The user positioninformation storing section117 stores user position information that is the calculation result of the position of the user with respect to the device center and the frontward axis of theimage display device100 by the user positioninformation calculating section134 of the viewingstate analyzing section116. The user position information stored in the user positioninformation storing section117 is sent to the systemoptimization processing section120.
The audio property optimization processing section 142 of the system optimization processing section 120 calculates, based on the user position information sent from the user position information storing section 117, audio property control information for performing an audio property optimization process on the speaker section 108 of the image display device 100, so as to optimize the audio property of the image display device 100 for the user at the desired position. The audio property control information calculated by the audio property optimization processing section 142 is sent to the audio property control section 152 of the system control section 122.
The audio property optimization process includes a process for optimizing the left/right volume balance of the stereo sound output from the speaker section 108, and an optimization process related to the surround effect of that stereo sound. Since a difference arises in the left/right volume balance of the stereo sound output from the speaker section 108 depending on the position of the user, the optimization process adjusts the gains on the left and right sides. For example, as shown in FIG. 11, following the principle that the volume of the stereo sound output from the speaker section 108 attenuates in inverse proportion to the square of the distance, the differences from a reference point are calculated as follows; from them, the difference (gain_dif) in volume between the left and right sides of the speaker section 108 is obtained, and the left/right volume balance can be optimized.
gain_Lch=20*log(d_Lch/d_org_LRch)
gain_Rch=20*log(d_Rch/d_org_LRch)
gain_dif=gain_Rch−gain_Lch=20*log(d_Rch)−20*log(d_Lch)=20*log(d_Rch/d_Lch)
Here, note that:
gain_Lch: the gain difference of Lch
gain_Rch: the gain difference of Rch
d_org_LRch: the distance from the left and right speaker sections to the reference point
d_Lch: the distance from the Lch speaker section to the user
d_Rch: the distance from the Rch speaker section to the user
gain_dif: the voltage gain difference of L/R
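The gain computation above can be sketched in Python as follows (the function name is hypothetical, and the base-10 logarithm conventional for decibel gains is assumed). Note that the reference distance d_org_LRch cancels out of the difference:

```python
import math

def lr_gain_difference(d_lch, d_rch):
    """Left/right gain difference (in dB) for a listener at distance d_lch
    from the left speaker and d_rch from the right one; since volume falls
    off with distance, the gain difference compensates for the listener
    sitting closer to one speaker than the other."""
    return 20.0 * math.log10(d_rch / d_lch)
```

A centered listener (equal distances) yields a difference of 0 dB, so no rebalancing is applied.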
Further, if a plurality of users is present, the left/right volume balance of the sound output from the speaker section 108 may be optimized with respect to the center of balance of that user group, or alternatively with priority given to a particular user. As a method of calculating the center of balance of the user group, when a plurality of users, namely users A, B, C, and D as shown in FIG. 12, is present, the calculation can be performed as follows.
d_cent_dif = 0;
temp_CAM_DIS = 0;
if (CAM_AUDIENCE != 0) {
    for (int i = 0; i < CAM_AUDIENCE; i++) {
        d_cent_dif += CAM_DIS[i] * tan(CAM_HOR_ANGLE[i] * PI / 180);
        temp_CAM_DIS += CAM_DIS[i];
    }
    /* divisions kept inside the guard so that CAM_AUDIENCE == 0
       does not cause a division by zero */
    d_cent_dif = d_cent_dif / CAM_AUDIENCE;  /* return (use) value: center-of-balance angle */
    CAM_DIS = temp_CAM_DIS / CAM_AUDIENCE;   /* return (use) value: center-of-balance distance */
}
Here, note that:
CAM_AUDIENCE: the number of users within the imaging area of the imaging sections 104
CAM_HOR_ANGLE[0]: angle of user A
CAM_HOR_ANGLE[1]: angle of user B
CAM_HOR_ANGLE[2]: angle of user C
CAM_HOR_ANGLE[3]: angle of user D
CAM_DIS[0]: distance of user A
CAM_DIS[1]: distance of user B
CAM_DIS[2]: distance of user C
CAM_DIS[3]: distance of user D
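The same center-of-balance computation, as a Python sketch (the function name is hypothetical; angles are in degrees, as in the listing above):

```python
import math

def center_of_balance(hor_angles_deg, distances):
    """Center of balance of a user group, following the listing above:
    the averaged lateral offset (distance * tan(horizontal angle)) and the
    averaged distance over all detected users. Returns None if no user
    is detected, mirroring the CAM_AUDIENCE != 0 guard."""
    n = len(distances)
    if n == 0:
        return None
    lateral = sum(d * math.tan(math.radians(a))
                  for a, d in zip(hor_angles_deg, distances)) / n
    return lateral, sum(distances) / n
```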
The image property optimization processing section 144 of the system optimization processing section 120 calculates, based on the user position information sent from the user position information storing section 117, image property control information for performing the image property optimization process on the display panel 102 of the image display device 100, so as to optimize the image property of the image display device 100 for the user at the desired position. The image property control information calculated by the image property optimization processing section 144 is sent to the image property control section 154 of the system control section 122.
The image property optimization process includes processes such as a gamma correction for optimizing the appearance of black, and a correction of the color gain for coping with a color shift.
For example, the gamma correction is performed as follows.
γ=2.2+image quality correction−0.1×user direction
Further, for example, the correction of the color gain is performed as follows.
ColorGain=User Color+image quality correction±α×user direction
R(G,B) Gain=R(G,B)×image quality correction±α×user direction
Further, if a plurality of users is present, the gamma correction and the correction of the color gain can be performed with respect to the center of balance of the user group, or alternatively with priority given to a particular user. Assuming that in the horizontal direction the position of user A is [D0, Ah0] and the position of user B is [D1, Ah1] as shown in FIG. 13(A), and that in the vertical direction the position of user A is [D0, Av0] and the position of user B is [D1, Av1] as shown in FIG. 13(B), an average viewing angle correction coefficient and the setting data for the system optimization process are calculated by the following formulas.
Average viewing angle correction coefficient={(1/D0*(Ah0+Av0)+1/D1*(Ah1+Av1)+1/Dn*(Ahn+Avn))/n}*correction value
Setting data=(basic data)*{1+(correction value at the maximum viewing angle)*(average viewing angle correction coefficient)}
The average viewing angle weights close-by users more heavily, and is the average of the horizontal and vertical angles over the number of users. The correction coefficient is calculated by multiplying this average viewing angle by the correction value, and the overall correction amount is calculated by multiplying the correction coefficient by the correction value at the maximum viewing angle. The setting data is calculated by applying this correction amount to the basic data (the data with no correction: γ=2.2+image quality correction).
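As a Python sketch of these two formulas (the function name and the tuple layout are illustrative; the 1/D weighting favors nearby users):

```python
def viewing_angle_setting(positions, correction_value, max_corr, basic_data):
    """Setting data per the two formulas above: the average viewing angle
    correction coefficient sums (Ah + Av) / D over all users and divides
    by the user count, then the basic data is scaled by the correction
    at the maximum viewing angle. positions is a list of (D, Ah, Av)
    tuples, one per user."""
    n = len(positions)
    coeff = sum((ah + av) / d for d, ah, av in positions) / n * correction_value
    return basic_data * (1.0 + max_corr * coeff)
```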
Further, in the optimization of the image property, since the optimal value varies depending on the age and sex of the user, the optimization process can be performed by using attribute information such as the age and sex obtained from the image processing section, in addition to the user position information. For example, a luminance correction of the display panel 102 is performed as follows.
BackLight=basic setting value*correction value
Correction value=10^(A*log screen illumination+B*log viewing angle+C*log image average level+D*age)/screen illumination
Further, experiments show that the optimal luminance has the following relationship with the screen illumination, the viewing angle, the average image luminance level, and the age.
Log optimal luminance=A*log screen illumination+B*log viewing angle+C*log image average level+D*age
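Combining the two luminance formulas above into a Python sketch (the coefficient values A to D are experiment-derived and not given in the text, the function interface is an assumption, and base-10 logarithms are assumed):

```python
import math

def backlight_setting(basic, screen_illum, viewing_angle, avg_level, age,
                      a, b, c, d):
    """Backlight correction per the empirical relation above:
    log(optimal luminance) = A*log(screen illumination)
    + B*log(viewing angle) + C*log(image average level) + D*age.
    The correction value is the optimal luminance divided by the
    screen illumination, applied to the basic setting value."""
    optimal = 10.0 ** (a * math.log10(screen_illum)
                       + b * math.log10(viewing_angle)
                       + c * math.log10(avg_level)
                       + d * age)
    return basic * (optimal / screen_illum)
```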
The device direction optimization processing section 146 of the system optimization processing section 120 calculates, based on the user position information sent from the user position information storing section 117, device direction control information for performing a device direction optimization process on the mechanism section 109 of the image display device 100, so as to optimize the device direction of the image display device 100 for the user at the desired position. The device direction control information calculated by the device direction optimization processing section 146 is sent to the device direction control section 156 of the system control section 122.
As the device direction optimization process, if the mechanism section 109 is equipped with a swivel mechanism in the vertical and horizontal directions, the image display device 100 is rotated so that the frontward axis [0,0] of the image display device 100 points in the direction [φ1, θ1] of the user. Due to this, the display panel 102 of the image display device 100 can be oriented head-on as seen from the user.
The audio property control section 152 of the system control section 122 performs the audio property optimization process based on the audio property control information sent from the audio property optimization processing section 142. For example, the audio property control section 152 controls the volume balance of the sound output from the speaker section 108 based on that audio property control information.
The image property control section 154 of the system control section 122 performs the image property optimization process based on the image property control information sent from the image property optimization processing section 144. For example, the image property control section 154 controls the image property of the display panel section 102 based on that image property control information.
The device direction control section 156 of the system control section 122 performs the device direction optimization process based on the device direction control information sent from the device direction optimization processing section 146. For example, the device direction control section 156 controls the mechanism section 109 of the image display device 100 based on that device direction control information.
As above, the configuration of the control section 110 included in the image display device 100 of the embodiment of the present invention has been explained with reference to FIG. 3 and FIG. 4. Next, the optimization process in accordance with the position of the user by the image display device 100 of the embodiment of the present invention will be explained.
[1-3. Optimization Process according to User Position]
FIG. 8 is a flow diagram showing an example of the optimization process in accordance with the position of the user by the image display device 100 of the embodiment of the present invention. Hereinbelow, this optimization process will be explained with reference to FIG. 8.
In FIG. 8, firstly, when the imaging sections 104 of the image display device 100 start taking an image, the image input section 112 of the control section 110 receives the image taken by the imaging sections 104 (step S802).
Then, the image processing section 114 of the control section 110 performs the process of detecting the face included in the image received by the image input section 112 (step S804).
Then, the viewing state analyzing section 116 of the control section 110 calculates the relative position of the user with respect to the optical axis of the camera of the respective imaging sections 104 in the user direction/distance calculating section 132, and calculates the user position with respect to the device center and the frontward axis of the image display device 100 in the user position information calculating section 134 (step S806).
Then, the system optimization processing section 120 of the control section 110 calculates, based on the user position information calculated in step S806, system control information for performing the system optimization process that optimizes the state of the image display device 100 for the user at the desired position (step S808). For example, in step S808, the system control information for optimizing the left/right volume balance of the sound output from the speaker section 108 is calculated. Further, in step S808, the system control information for performing processes such as the gamma correction for optimizing the appearance of black and the correction of the color gain for coping with the color shift is calculated. Further, in step S808, the system control information for optimizing the device direction of the image display device 100 is calculated.
Then, the system control section 122 of the control section 110 performs the system optimization process based on the system control information calculated in step S808 (step S810), and ends this process.
By the optimization process according to the position of the user in FIG. 8, the state of the image display device 100 for the user at the desired position can be optimized. For example, the left/right volume balance of the sound output from the speaker section 108 is optimized, so the user can watch the image display device 100 without feeling uncomfortable. Further, the appearance of black and the color shift are optimized, so the user can satisfactorily watch the image displayed on the image display device 100. Further, the image display device 100 is turned to face the user head-on, so the user can satisfactorily watch the image displayed on the image display device 100.
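The flow of FIG. 8 can be summarized as a Python sketch, where all callables are hypothetical stand-ins for the sections described above:

```python
def optimization_step(frame, detect_face, calc_position, optimizers, controllers):
    """One pass of the FIG. 8 flow: detect the viewer's face in the
    captured frame (S804), derive the user position (S806), compute
    control information with each optimization processing section (S808),
    and hand it to the matching control section to apply (S810)."""
    face = detect_face(frame)                       # S804
    if face is None:
        return                                      # no viewer detected
    position = calc_position(face)                  # S806
    for optimize, apply_control in zip(optimizers, controllers):
        apply_control(optimize(position))           # S808 + S810
```

In the device this loop would run per captured frame, with one optimizer/controller pair each for audio, image property, and device direction.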
[1-4. Optimization Process according to User Positions of One or More Users]
Next, an optimization process in accordance with the positions of one or more users by the image display device 100 of the embodiment of the present invention will be explained. FIG. 9 is a flow diagram showing an example of this optimization process.
In FIG. 9, firstly, when the imaging sections 104 of the image display device 100 start taking an image, the image input section 112 of the control section 110 receives the image taken by the imaging sections 104, and the image processing section 114 of the control section 110 performs the process of detecting the faces included in the image received by the image input section 112, and the like (step S902).
Then, the viewing state analyzing section 116 of the control section 110 receives the result of the face detection process by the image processing section 114, and determines from it whether the number of detected users is one or more than one (step S904).
As a result of the determination of step S904, if the number of detected users is one, the viewing state analyzing section 116 of the control section 110 calculates the horizontal angle and the vertical angle of that user (step S906).
Then, the system optimization processing section 120 of the control section 110 calculates the correction coefficient for the system control information for performing the system optimization process based on the horizontal angle and the vertical angle calculated in step S906 (step S908).
As a result of the determination of step S904, if a plurality of users is detected, the viewing state analyzing section 116 of the control section 110 determines whether the plurality of users is at the center of the image or not (step S910).
As a result of the determination of step S910, if the plurality of users is not at the center of the image (NO at step S910), the viewing state analyzing section 116 of the control section 110 calculates the horizontal angle, the vertical angle, and the distance for each of the plurality of users, and averages them (step S912). Alternatively, in step S912, the horizontal angle, the vertical angle, and the distance may be calculated for each of the plurality of users, and the position of the center of balance of the plurality of users may be calculated.
Then, the system optimization processing section 120 of the control section 110 calculates the correction coefficient for the system control information for performing the system optimization process (step S914).
As a result of the determination of step S910, if the plurality of users is at the center of the image (YES at step S910), the system optimization processing section 120 of the control section 110 calculates the system control information without the correction coefficient, or with a changed weighting of the correction coefficient (step S916).
After the correction coefficient has been calculated in step S908, S914, or S916, the system optimization processing section 120 of the control section 110 calculates the system control information by applying the correction coefficient to the basic data of the system control information for the system optimization process (step S918), and ends the process.
By the optimization process according to the positions of the one or more users in FIG. 9, the state of the image display device 100 for the plurality of users at the desired positions can be optimized.
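The branch structure of FIG. 9 can be sketched in Python (all names are hypothetical; the single-user coefficient function stands in for steps S906/S908):

```python
def multiuser_coefficient(angles, at_center, coeff_from_angles):
    """Correction-coefficient selection per FIG. 9: one user uses that
    user's horizontal/vertical angles (S906/S908); several off-center
    users use the averaged angles (S912/S914); users at the image center
    get no correction (S916, here simply zero)."""
    if len(angles) == 1:
        return coeff_from_angles(*angles[0])
    if at_center:
        return 0.0                      # no correction (or a re-weighted one)
    n = len(angles)
    avg_h = sum(h for h, _ in angles) / n
    avg_v = sum(v for _, v in angles) / n
    return coeff_from_angles(avg_h, avg_v)
```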
[1-5. Optimization Process according to Ages of One or More Users]
Next, an optimization process in accordance with the ages of the one or more users by the image display device 100 of the embodiment of the present invention will be explained. FIG. 10 is a flow diagram showing an example of this optimization process.
In FIG. 10, firstly, when the imaging sections 104 of the image display device 100 start taking an image, the image input section 112 of the control section 110 receives the image taken by the imaging sections 104, and the image processing section 114 of the control section 110 performs the process of detecting the faces included in the image received by the image input section 112, and the like (step S1002).
Then, the viewing state analyzing section 116 of the control section 110 receives the result of the face detection process by the image processing section 114, and determines from it whether the number of detected users is one or more than one (step S1004).
As a result of the determination of step S1004, if the number of detected users is one, the viewing state analyzing section 116 of the control section 110 analyzes the age of that user (step S1006).
Then, the system optimization processing section 120 of the control section 110 calculates the correction coefficient for the system control information for the system optimization process based on the analysis result of the age in step S1006 (step S1008).
As a result of the determination of step S1004, if a plurality of users is detected, the viewing state analyzing section 116 of the control section 110 analyzes the ages of the respective users (step S1010).
Then, the viewing state analyzing section 116 of the control section 110 either calculates the system control information without a correction coefficient, or calculates correction coefficients for the respective users independently and averages them (step S1012).
After step S1008 or S1012, the system optimization processing section 120 of the control section 110 calculates the system control information by applying the correction coefficient to the basic data of the system control information for the system optimization process (step S1014), and ends the process.
By the optimization process according to the ages of the one or more users in FIG. 10, the state of the image display device 100 for the plurality of users of various ages can be optimized.
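The age branch of FIG. 10 reduces to a short Python sketch (hypothetical names; the per-age coefficient function stands in for steps S1006/S1008):

```python
def age_coefficient(ages, coeff_for_age):
    """Age-based correction per FIG. 10: one user uses that user's
    coefficient (S1006/S1008); several users get per-user coefficients
    that are then averaged (S1010/S1012). With no detected user the
    correction defaults to zero."""
    if not ages:
        return 0.0
    return sum(coeff_for_age(a) for a in ages) / len(ages)
```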
Further, in the embodiment, the system optimization processing section 120 may calculate system control information for optimizing the letter size of a GUI displayed on the display panel section 102 based on the user position information. The system control information calculated by the system optimization processing section 120 is sent to the system control section 122, and the optimization process for optimizing the letter size of the GUI displayed on the display panel section 102 is performed by the system control section 122.
As the optimization process of the letter size of the GUI, as shown in FIG. 14(A), there is a process that enlarges the letter size of the GUI displayed on the display panel section 102 when the user approaches the image display device 100. In this case, the letter size of the GUI is enlarged when the user approaches the image display device 100, and reduced when the user moves away from it. For example, when the letter size of the GUI is small and hard to read, the letter size increases as the user approaches, so the user can easily recognize the GUI.
Further, as the optimization process of the letter size of the GUI, as shown in FIG. 14(B), there is a process that refines the displayed information, that is, increases the amount of information, by reducing the letter size of the GUI displayed on the display panel section 102 when the user approaches the image display device 100. In this case, the letter size of the GUI is reduced and the displayed information is increased when the user approaches the image display device 100, and the letter size is enlarged when the user moves away. For example, if the user is far away while a program list is displayed on the display panel section 102, the amount of information is reduced by enlarging the letter size, and when the user approaches, the amount of information is increased by reducing the letter size.
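Both letter-size policies can be captured in one Python sketch (all numeric parameters are hypothetical; the patent does not specify a scaling law):

```python
def gui_letter_size(distance, base_size=24.0, reference_distance=2.0,
                    enlarge_when_near=True):
    """Letter-size policy sketch: with enlarge_when_near=True, letters grow
    as the user approaches, for readability (FIG. 14(A)); with False,
    letters shrink on approach so that more information fits on screen
    (FIG. 14(B)). A simple linear scaling with 1/distance is assumed."""
    ratio = reference_distance / distance
    return base_size * ratio if enlarge_when_near else base_size / ratio
```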
Further, in the present embodiment, as shown in FIGS. 15(A) to 15(C), in the calculation of the viewing positions of the users, the variation in face size may be corrected by using the reference face size [w0, h0] at the reference distance d0 in a correction table as follows. For example, from attribute information such as the user's age, a data table of the average face size for each age is stored in advance; if the user is a child, the reference face size [w0, h0] may be set to a face size [w0C, h0C] that is smaller than the reference face size, as shown in FIG. 15(C), and if the user is an adult, it may be set to a face size [w0A, h0A] that is larger than the reference face size, as shown in FIG. 15(B).
Further, in the present embodiment, in the calculation of the viewing position of the user, when the users who use the image display device 100, for example the family living at the installation site of the image display device 100, are registered in the image display device 100 in advance, the face size of each user may be registered as a data table. Due to this, the reference face size can be changed for each user. A method of registering the face size for each user can be realized by taking images together with distance information in cooperation with another distance sensor (not shown), by taking images after guiding the user to a predetermined distance, or by taking images of a scale serving as the reference at the same distance.
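The role of the reference face size can be sketched in Python (the function is a hypothetical illustration using a simple inverse-proportion model of apparent size):

```python
def estimate_distance(face_width, reference_width, reference_distance):
    """Viewing-distance estimate: the apparent face size is taken to be
    inversely proportional to distance, so d = d0 * (w0 / w). The
    reference width w0 can be swapped per age group ([w0C, h0C] for a
    child, [w0A, h0A] for an adult) or per registered user, which is why
    the per-user face-size table above improves the estimate."""
    return reference_distance * (reference_width / face_width)
```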
Further, in the present embodiment, even if the user moves out of the imaging area of the imaging sections 104, the aforementioned system optimization process can be continued by estimating the position of the user outside the imaging area from chronological transition information.
Further, in the present embodiment, in the system optimization process, an appropriate time constant may be set according to the viewing environment of the user. Due to this, even if the user makes an abrupt positional change, the system optimization process can be continued smoothly.
Note that the aforementioned series of processes may be performed by hardware, or may be performed by software. When software performs the series of processes, a program configuring that software is installed from a program storage medium onto a computer built into dedicated hardware, or, for example, onto a general-purpose personal computer that can perform various functions by installing various programs.
The preferred embodiment of the present invention has been described above with reference to the appended drawings, but the present invention is not limited thereto. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
REFERENCE SIGNS LIST
  • 100 Image display device
  • 102 Display panel section
  • 104 Imaging section
  • 106 Sensor section
  • 108 Speaker section
  • 109 Mechanism section
  • 110 Control section
  • 112 Image input section
  • 114 Image processing section
  • 116 Viewing state analyzing section
  • 117 User position information storing section
  • 118 Viewing state recording section
  • 120 System optimization processing section
  • 122 System control section
  • 132 User direction/distance calculating section
  • 134 User position information calculating section
  • 142 Audio property optimization processing section
  • 144 Image property optimization processing section
  • 146 Device direction optimization processing section
  • 152 Audio property control section
  • 154 Image property control section
  • 156 Device direction control section
  • 200 Data collection server

Claims (18)

The invention claimed is:
1. An information processing device, comprising:
a sensor that detects a user within a detectable range from the sensor; and
processing circuitry that
detects a face of the user based on image data captured by the sensor;
calculates system control information for optimizing an output of a display system based on a respective position of the user detected by the sensor;
analyzes the face of the user to determine an age of the user;
calculates a correction coefficient to correct the system control information according to the age of the user;
corrects the system control information based on the correction coefficient; and
optimizes the output of the display system based on the corrected system control information.
2. The information processing device according to claim 1, wherein the processing circuitry calculates the system control information to optimize a sound output of the display system based on the respective position of the user detected by the sensor.
3. The information processing device according to claim 2, wherein the processing circuitry calculates the system control information to optimize a volume balance of the sound output from an audio output of the display system based on the respective position of the user detected by the sensor.
4. The information processing device according to claim 2, wherein the processing circuitry calculates the system control information to adjust an output direction of the sound output of the display system based on the respective position of the user detected by the sensor.
5. The information processing device according to claim 1, wherein
the sensor detects a plurality of users within the detectable range from the sensor, and
the processing circuitry calculates the system control information to optimize the output of the display system based on respective positions of each of the plurality of users detected by the sensor.
6. The information processing device according to claim 1, wherein the processing circuitry calculates the system control information to optimize an image property of the output of the display system based on the respective position of the user detected by the sensor.
7. The information processing device according to claim 1, wherein the processing circuitry calculates the system control information to optimize a display content that is output by the display system based on the respective position of the user detected by the sensor.
8. The information processing device according to claim 1, wherein
the processing circuitry calculates the system control information to modify a display direction of the display system based on the respective position of the user detected by the sensor, and
the display system modifies the display direction of the display system according to the system control information.
9. The information processing device according to claim 1, wherein the processing circuitry further
calculates a position of the user according to the face of the user, and
calculates the system control information for optimizing the output of the display system based on the calculated position of the user.
10. The information processing device according to claim 1, wherein the processing circuitry, to correct the system control information based on the correction coefficient, adjusts a size of the output of the display system.
11. The information processing device according to claim 1, wherein the processing circuitry, to correct the system control information based on the correction coefficient, adjusts a brightness of the output of the display system.
12. The information processing device according to claim 1, wherein the processing circuitry further
calculates the system control information to optimize a sound output of the display system based on the respective position of the user detected by the sensor, and
adjusts a volume of the sound output of the display system to correct the system control information based on the correction coefficient.
13. The information processing device according to claim 1, wherein
the output of the display system includes a graphical user interface, and
the processing circuitry, to correct the system control information based on the correction coefficient, adjusts a position of the graphical user interface output by the display system.
14. The information processing device according to claim 1, wherein
the output of the display system includes a graphical user interface, and
the processing circuitry, to correct the system control information based on the correction coefficient, adjusts a brightness of the graphical user interface output by the display system.
15. The information processing device according to claim 1, wherein the processing circuitry sets a time constant based on the respective position of the user.
16. A controlling method, comprising:
detecting, by a sensor, a user within a detectable range from the sensor;
detecting a face of the user based on image data captured by the sensor;
calculating, by processing circuitry, system control information to optimize an output of a display system based on a respective position of the user detected by the sensor;
analyzing the face of the user to determine an age of the user;
calculating a correction coefficient to correct the system control information according to the age of the user;
correcting, by the processing circuitry, the system control information based on the correction coefficient; and
optimizing, by the processing circuitry, the output of the display system based on the corrected system control information.
17. The controlling method according to claim 16, further comprising:
setting, by the processing circuitry, a time constant based on the respective position of the user.
18. A non-transitory computer readable medium storing computer readable instructions that, when executed by a processor, cause the processor to:
control a sensor to detect a user within a detectable range from the sensor;
detect a face of the user based on image data captured by the sensor;
calculate system control information to optimize an output of a display system based on a respective position of the user detected by the sensor;
analyze the face of the user to determine an age of the user;
calculate a correction coefficient to correct the system control information according to the age of the user;
correct the system control information based on the correction coefficient; and
optimize the output of the display system based on the corrected system control information.
US14/589,107, filed 2015-01-05 (priority date 2009-09-15): Display device and controlling method. Granted as US9489043B2; active, adjusted expiration 2030-08-22.

Priority Applications (1)

Application Number: US14/589,107 (US9489043B2) · Priority Date: 2009-09-15 · Filing Date: 2015-01-05 · Title: Display device and controlling method

Applications Claiming Priority (5)

JP2009-213377 · Priority Date: 2009-09-15
JP2009213377A (JP5568929B2) · Priority Date: 2009-09-15 · Filing Date: 2009-09-15 · Title: Display device and control method
PCT/JP2010/062311 (WO2011033855A1) · Priority Date: 2009-09-15 · Filing Date: 2010-07-22 · Title: Display device and control method
US 13/395,035 · Filing Date: 2012-06-21
US14/589,107 (US9489043B2) · Priority Date: 2009-09-15 · Filing Date: 2015-01-05 · Title: Display device and controlling method

Related Parent Applications (2)

Application Number | Title | Priority Date | Filing Date
US13/395,035 (Continuation; US8952890B2) | Display device and controlling method | 2009-09-15 | 2010-07-22
PCT/JP2010/062311 (Continuation; WO2011033855A1) | Display device and control method | 2009-09-15 | 2010-07-22

Publications (2)

Publication Number | Publication Date
US20150185830A1 (en) | 2015-07-02
US9489043B2 (en) | 2016-11-08

Family

ID=43758468

Family Applications (2)

Application Number | Title | Priority Date | Filing Date
US13/395,035 (US8952890B2; Active, expires 2031-08-09) | Display device and controlling method | 2009-09-15 | 2010-07-22
US14/589,107 (US9489043B2; Active, expires 2030-08-22) | Display device and controlling method | 2009-09-15 | 2015-01-05

Family Applications Before (1)

Application Number | Title | Priority Date | Filing Date
US13/395,035 (US8952890B2; Active, expires 2031-08-09) | Display device and controlling method | 2009-09-15 | 2010-07-22

Country Status (8)

Country | Link
US (2) | US8952890B2 (en)
EP (1) | EP2472863A4 (en)
JP (1) | JP5568929B2 (en)
KR (1) | KR101784754B1 (en)
CN (1) | CN102687522B (en)
BR (1) | BR112012005231A2 (en)
RU (1) | RU2553061C2 (en)
WO (1) | WO2011033855A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US11011095B2 (en)* | 2018-08-31 | 2021-05-18 | Chongqing Hkc Optoelectronics Technology Co., Ltd. | Display panel, and image control device and method thereof

Families Citing this family (33)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US9330589B2 (en)* | 2011-11-16 | 2016-05-03 | Nanolumens Acquisition, Inc. | Systems for facilitating virtual presence
JP5263092B2 (en) | 2009-09-07 | 2013-08-14 | ソニー株式会社 | Display device and control method
JP5418093B2 (en) | 2009-09-11 | 2014-02-19 | ソニー株式会社 | Display device and control method
JP2013031013A (en)* | 2011-07-28 | 2013-02-07 | Toshiba Corp | Electronic device, control method of electronic device, and control program of electronic device
CN102314848A (en)* | 2011-09-09 | 2012-01-11 | 深圳Tcl新技术有限公司 | Backlight-control method and system for liquid-crystal display device
JP5892797B2 (en)* | 2012-01-20 | 2016-03-23 | 日本放送協会 | Transmission / reception system, transmission / reception method, reception apparatus, and reception method
KR20130117525A (en)* | 2012-04-18 | 2013-10-28 | 삼성디스플레이 주식회사 | Image display system and driving method thereof
WO2014006757A1 (en)* | 2012-07-06 | 2014-01-09 | Necディスプレイソリューションズ株式会社 | Display device, and control method for display device
US9412375B2 (en) | 2012-11-14 | 2016-08-09 | Qualcomm Incorporated | Methods and apparatuses for representing a sound field in a physical space
JP6058978B2 (en)* | 2012-11-19 | 2017-01-11 | サターン ライセンシング エルエルシー (Saturn Licensing LLC) | Image processing apparatus, image processing method, photographing apparatus, and computer program
CN104798129B (en)* | 2012-11-27 | 2018-10-19 | 索尼公司 | Display device, display methods and computer-readable medium
US20140153753A1 (en)* | 2012-12-04 | 2014-06-05 | Dolby Laboratories Licensing Corporation | Object Based Audio Rendering Using Visual Tracking of at Least One Listener
US9648413B2 (en)* | 2013-02-05 | 2017-05-09 | Toa Corporation | Loudspeaker system
WO2014126991A1 (en)* | 2013-02-13 | 2014-08-21 | Vid Scale, Inc. | User adaptive audio processing and applications
US10139925B2 (en)* | 2013-03-04 | 2018-11-27 | Microsoft Technology Licensing, Llc | Causing specific location of an object provided to a device
DE102013206569B4 (en)* | 2013-04-12 | 2020-08-06 | Siemens Healthcare Gmbh | Gesture control with automated calibration
US20160334884A1 (en)* | 2013-12-26 | 2016-11-17 | Interphase Corporation | Remote Sensitivity Adjustment in an Interactive Display System
CN104298347A (en)* | 2014-08-22 | 2015-01-21 | 联发科技(新加坡)私人有限公司 | Method and device for controlling screen of electronic display device and display system
WO2016072128A1 (en)* | 2014-11-04 | 2016-05-12 | ソニー株式会社 | Information processing device, communication system, information processing method, and program
JP2016092765A (en)* | 2014-11-11 | 2016-05-23 | 株式会社リコー | Information processing device, user detection method and program
CN106713793A (en)* | 2015-11-18 | 2017-05-24 | 天津三星电子有限公司 | Sound playing control method and device thereof
CN109154979A (en)* | 2016-10-26 | 2019-01-04 | 奥康科技有限公司 | For analyzing image and providing the wearable device and method of feedback
WO2018155354A1 (en)* | 2017-02-21 | 2018-08-30 | パナソニックIpマネジメント株式会社 | Electronic device control method, electronic device control system, electronic device, and program
EP3396226B1 (en)* | 2017-04-27 | 2023-08-23 | Advanced Digital Broadcast S.A. | A method and a device for adjusting a position of a display screen
CN107632708B (en)* | 2017-09-22 | 2021-08-17 | 京东方科技集团股份有限公司 | Control method and control device for visual angle of screen and flexible display device
KR102803508B1 (en)* | 2018-09-04 | 2025-05-08 | 삼성전자주식회사 | Display apparatus and method for controlling thereof
US11032508B2 (en)* | 2018-09-04 | 2021-06-08 | Samsung Electronics Co., Ltd. | Display apparatus and method for controlling audio and visual reproduction based on user's position
CN110503891A (en)* | 2019-07-05 | 2019-11-26 | 太仓秦风广告传媒有限公司 | A kind of electronic bill-board transform method and its system based on distance change
JP2021015203A (en)* | 2019-07-12 | 2021-02-12 | 富士ゼロックス株式会社 | Image display device, image forming apparatus, and program
CN112073804B (en)* | 2020-09-10 | 2022-05-20 | 深圳创维-Rgb电子有限公司 | Television sound adjusting method, television and storage medium
TW202333491A (en)* | 2021-12-28 | 2023-08-16 | 日商索尼集團公司 | Information processing device, information processing method, and program
CN114999335B (en)* | 2022-06-10 | 2023-08-15 | 长春希达电子技术有限公司 | LED spliced screen seam repairing method based on ultra-wideband and one-dimensional envelope peak value
CN115914949A (en)* | 2022-08-31 | 2023-04-04 | 深圳市当智科技有限公司 | Sound effect compensation method, projector and storage medium

Citations (41)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US4973149A (en) | 1987-08-19 | 1990-11-27 | Center For Innovative Technology | Eye movement detector
JPH05137200A (en) | 1991-11-14 | 1993-06-01 | Sony Corp | Automatic adjustment device for stereo sound volume balance
US5258586A (en) | 1989-03-20 | 1993-11-02 | Hitachi, Ltd. | Elevator control system with image pickups in hall waiting areas and elevator cars
JPH09247564A (en) | 1996-03-12 | 1997-09-19 | Hitachi Ltd | Television receiver
US6084367A (en) | 1996-04-02 | 2000-07-04 | Landert; Heinrich | Method of operating a door system and a door system operating by this method
US6215471B1 (en) | 1998-04-28 | 2001-04-10 | Deluca Michael Joseph | Vision pointer method and apparatus
US6243076B1 (en) | 1998-09-01 | 2001-06-05 | Synthetic Environments, Inc. | System and method for controlling host system interface with point-of-interest data
US20010012001A1 (en) | 1997-07-07 | 2001-08-09 | Junichi Rekimoto | Information input apparatus
US20020105482A1 (en) | 2000-05-26 | 2002-08-08 | Lemelson Jerome H. | System and methods for controlling automatic scrolling of information on a display or screen
US6611297B1 (en)* | 1998-04-13 | 2003-08-26 | Matsushita Electric Industrial Co., Ltd. | Illumination control method and illumination device
EP1426919A1 (en) | 2002-12-02 | 2004-06-09 | Sony International (Europe) GmbH | Method for operating a display device
JP2005044330A (en) | 2003-07-24 | 2005-02-17 | Univ Of California San Diego | Weak hypothesis generation apparatus and method, learning apparatus and method, detection apparatus and method, facial expression learning apparatus and method, facial expression recognition apparatus and method, and robot apparatus
US6863609B2 (en) | 2000-08-11 | 2005-03-08 | Konami Corporation | Method for controlling movement of viewing point of simulated camera in 3D video game, and 3D video game machine
EP1566788A2 (en) | 2004-01-23 | 2005-08-24 | Sony United Kingdom Limited | Display
US20050253807A1 (en) | 2004-05-11 | 2005-11-17 | Peter Hohmann | Method for displaying information and information display system
US20060139314A1 (en) | 2002-05-28 | 2006-06-29 | Matthew Bell | Interactive video display system
US7117380B2 (en) | 2003-09-30 | 2006-10-03 | International Business Machines Corporation | Apparatus, system, and method for autonomic power adjustment in an electronic device
US20060227103A1 (en) | 2005-04-08 | 2006-10-12 | Samsung Electronics Co., Ltd. | Three-dimensional display device and method using hybrid position-tracking system
JP2007065766A (en) | 2005-08-29 | 2007-03-15 | Sony Corp | Image processor and method, and program
US20070126884A1 (en) | 2005-12-05 | 2007-06-07 | Samsung Electronics, Co., Ltd. | Personal settings, parental control, and energy saving control of television with digital video camera
US20080088437A1 (en) | 2005-05-06 | 2008-04-17 | Omnilink Systems, Inc. | System and method for monitoring alarms and responding to the movement of individuals and assets
US20080118152A1 (en) | 2006-11-20 | 2008-05-22 | Sony Ericsson Mobile Communications AB | Using image recognition for controlling display lighting
JP2008172817A (en) | 2008-02-18 | 2008-07-24 | Seiko Epson Corp | Control system and controlled device adapted to this system
JP2009094723A (en) | 2007-10-05 | 2009-04-30 | Mitsubishi Electric Corp | Television receiver
US20090251458A1 (en) | 2008-04-07 | 2009-10-08 | Sony Corporation | Image signal generating apparatus, image signal generation method, computer program, and recording medium
US7627139B2 (en) | 2002-07-27 | 2009-12-01 | Sony Computer Entertainment Inc. | Computer image and audio processing of intensity and input devices for interfacing with a computer program
US20090315869A1 (en) | 2008-06-18 | 2009-12-24 | Olympus Corporation | Digital photo frame, information processing system, and control method
US7834912B2 (en) | 2006-04-19 | 2010-11-16 | Hitachi, Ltd. | Attention level measuring apparatus and an attention level measuring system
US20110116685A1 (en)* | 2009-11-16 | 2011-05-19 | Sony Corporation | Information processing apparatus, setting changing method, and setting changing program
US20110135114A1 (en) | 2008-08-22 | 2011-06-09 | Sony Corporation | Image display device, control method and computer program
US7986424B2 (en) | 2005-03-11 | 2011-07-26 | Brother Kogyo Kabushiki Kaisha | Location-based information
US20110237324A1 (en)* | 2010-03-29 | 2011-09-29 | Microsoft Corporation | Parental control settings based on body dimensions
US20110254691A1 (en) | 2009-09-07 | 2011-10-20 | Sony Corporation | Display device and control method
US8085243B2 (en) | 2006-02-03 | 2011-12-27 | Panasonic Corporation | Input device and its method
US20120114137A1 (en) | 2010-11-05 | 2012-05-10 | Shingo Tsurumi | Acoustic Control Apparatus and Acoustic Control Method
US8199108B2 (en) | 2002-12-13 | 2012-06-12 | Intellectual Ventures Holding 67 Llc | Interactive directed light/sound system
US8230367B2 (en) | 2007-09-14 | 2012-07-24 | Intellectual Ventures Holding 67 Llc | Gesture-based user interactions with status indicators for acceptable inputs in volumetric zones
US8233033B2 (en)* | 2003-12-18 | 2012-07-31 | Tp Vision Holding B.V. | Supplementary visual display system
US20120206340A1 (en) | 2009-09-11 | 2012-08-16 | Sony Corporation | Display method and display apparatus
US8400322B2 (en) | 2009-03-17 | 2013-03-19 | International Business Machines Corporation | Apparatus, system, and method for scalable media output
US9288387B1 (en)* | 2012-09-11 | 2016-03-15 | Amazon Technologies, Inc. | Content display controls based on environmental factors

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US6076928A (en)* | 1998-06-15 | 2000-06-20 | Fateh; Sina | Ideal visual ergonomic system for computer users
KR100312717B1 (en)* | 1998-06-16 | 2002-02-28 | 윤종용 | Automatic screen and volume control based on the viewer's location
US8123616B2 (en)* | 2003-03-25 | 2012-02-28 | Igt | Methods and apparatus for limiting access to games using biometric data
JP4734855B2 (en)* | 2004-06-23 | 2011-07-27 | 株式会社日立製作所 | Information processing device
RU2370817C2 (en)* | 2004-07-29 | 2009-10-20 | Самсунг Электроникс Ко., Лтд. | System and method for object tracking
JP4107288B2 (en)* | 2004-12-10 | 2008-06-25 | セイコーエプソン株式会社 | Control system, controlled apparatus and remote control apparatus compatible with this system
JP4225307B2 (en)* | 2005-09-13 | 2009-02-18 | 船井電機株式会社 | Television receiver
JP2008301167A (en)* | 2007-05-31 | 2008-12-11 | Sharp Corp | Liquid crystal television receiver
JP5564946B2 (en)* | 2007-09-20 | 2014-08-06 | 日本電気株式会社 | Video providing system and video providing method
EP2597868B1 (en)* | 2007-09-24 | 2017-09-13 | Qualcomm Incorporated | Enhanced interface for voice and video communications
WO2009067676A1 (en)* | 2007-11-21 | 2009-05-28 | Gesturetek, Inc. | Device access control
CN101925916B (en)* | 2007-11-21 | 2013-06-19 | 高通股份有限公司 | Method and system for controlling electronic device based on media preferences

Patent Citations (43)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US4973149A (en) | 1987-08-19 | 1990-11-27 | Center For Innovative Technology | Eye movement detector
US5258586A (en) | 1989-03-20 | 1993-11-02 | Hitachi, Ltd. | Elevator control system with image pickups in hall waiting areas and elevator cars
JPH05137200A (en) | 1991-11-14 | 1993-06-01 | Sony Corp | Automatic adjustment device for stereo sound volume balance
JPH09247564A (en) | 1996-03-12 | 1997-09-19 | Hitachi Ltd | Television receiver
US6084367A (en) | 1996-04-02 | 2000-07-04 | Landert; Heinrich | Method of operating a door system and a door system operating by this method
US20010012001A1 (en) | 1997-07-07 | 2001-08-09 | Junichi Rekimoto | Information input apparatus
US6611297B1 (en)* | 1998-04-13 | 2003-08-26 | Matsushita Electric Industrial Co., Ltd. | Illumination control method and illumination device
US6215471B1 (en) | 1998-04-28 | 2001-04-10 | Deluca Michael Joseph | Vision pointer method and apparatus
US6243076B1 (en) | 1998-09-01 | 2001-06-05 | Synthetic Environments, Inc. | System and method for controlling host system interface with point-of-interest data
US20020105482A1 (en) | 2000-05-26 | 2002-08-08 | Lemelson Jerome H. | System and methods for controlling automatic scrolling of information on a display or screen
US6863609B2 (en) | 2000-08-11 | 2005-03-08 | Konami Corporation | Method for controlling movement of viewing point of simulated camera in 3D video game, and 3D video game machine
US20060139314A1 (en) | 2002-05-28 | 2006-06-29 | Matthew Bell | Interactive video display system
US7627139B2 (en) | 2002-07-27 | 2009-12-01 | Sony Computer Entertainment Inc. | Computer image and audio processing of intensity and input devices for interfacing with a computer program
EP1426919A1 (en) | 2002-12-02 | 2004-06-09 | Sony International (Europe) GmbH | Method for operating a display device
US20040160386A1 (en) | 2002-12-02 | 2004-08-19 | Georg Michelitsch | Method for operating a display device
US8199108B2 (en) | 2002-12-13 | 2012-06-12 | Intellectual Ventures Holding 67 Llc | Interactive directed light/sound system
JP2005044330A (en) | 2003-07-24 | 2005-02-17 | Univ Of California San Diego | Weak hypothesis generation apparatus and method, learning apparatus and method, detection apparatus and method, facial expression learning apparatus and method, facial expression recognition apparatus and method, and robot apparatus
US7117380B2 (en) | 2003-09-30 | 2006-10-03 | International Business Machines Corporation | Apparatus, system, and method for autonomic power adjustment in an electronic device
US8233033B2 (en)* | 2003-12-18 | 2012-07-31 | Tp Vision Holding B.V. | Supplementary visual display system
US20050197923A1 (en) | 2004-01-23 | 2005-09-08 | Kilner Andrew R. | Display
EP1566788A2 (en) | 2004-01-23 | 2005-08-24 | Sony United Kingdom Limited | Display
US20050253807A1 (en) | 2004-05-11 | 2005-11-17 | Peter Hohmann | Method for displaying information and information display system
US7986424B2 (en) | 2005-03-11 | 2011-07-26 | Brother Kogyo Kabushiki Kaisha | Location-based information
US20060227103A1 (en) | 2005-04-08 | 2006-10-12 | Samsung Electronics Co., Ltd. | Three-dimensional display device and method using hybrid position-tracking system
US20080088437A1 (en) | 2005-05-06 | 2008-04-17 | Omnilink Systems, Inc. | System and method for monitoring alarms and responding to the movement of individuals and assets
JP2007065766A (en) | 2005-08-29 | 2007-03-15 | Sony Corp | Image processor and method, and program
US20070126884A1 (en) | 2005-12-05 | 2007-06-07 | Samsung Electronics, Co., Ltd. | Personal settings, parental control, and energy saving control of television with digital video camera
US8085243B2 (en) | 2006-02-03 | 2011-12-27 | Panasonic Corporation | Input device and its method
US7834912B2 (en) | 2006-04-19 | 2010-11-16 | Hitachi, Ltd. | Attention level measuring apparatus and an attention level measuring system
US20080118152A1 (en) | 2006-11-20 | 2008-05-22 | Sony Ericsson Mobile Communications AB | Using image recognition for controlling display lighting
US8230367B2 (en) | 2007-09-14 | 2012-07-24 | Intellectual Ventures Holding 67 Llc | Gesture-based user interactions with status indicators for acceptable inputs in volumetric zones
JP2009094723A (en) | 2007-10-05 | 2009-04-30 | Mitsubishi Electric Corp | Television receiver
JP2008172817A (en) | 2008-02-18 | 2008-07-24 | Seiko Epson Corp | Control system and controlled device adapted to this system
US20090251458A1 (en) | 2008-04-07 | 2009-10-08 | Sony Corporation | Image signal generating apparatus, image signal generation method, computer program, and recording medium
US20090315869A1 (en) | 2008-06-18 | 2009-12-24 | Olympus Corporation | Digital photo frame, information processing system, and control method
US20110135114A1 (en) | 2008-08-22 | 2011-06-09 | Sony Corporation | Image display device, control method and computer program
US8400322B2 (en) | 2009-03-17 | 2013-03-19 | International Business Machines Corporation | Apparatus, system, and method for scalable media output
US20110254691A1 (en) | 2009-09-07 | 2011-10-20 | Sony Corporation | Display device and control method
US20120206340A1 (en) | 2009-09-11 | 2012-08-16 | Sony Corporation | Display method and display apparatus
US20110116685A1 (en)* | 2009-11-16 | 2011-05-19 | Sony Corporation | Information processing apparatus, setting changing method, and setting changing program
US20110237324A1 (en)* | 2010-03-29 | 2011-09-29 | Microsoft Corporation | Parental control settings based on body dimensions
US20120114137A1 (en) | 2010-11-05 | 2012-05-10 | Shingo Tsurumi | Acoustic Control Apparatus and Acoustic Control Method
US9288387B1 (en)* | 2012-09-11 | 2016-03-15 | Amazon Technologies, Inc. | Content display controls based on environmental factors

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Combined Chinese Office Action and Search Report issued Jan. 14, 2015 in Patent Application No. 201080048006.6 (with English Translation).
European Office Action issued May 4, 2016 in European Application No. 10 816 968.1 (9 pages).
Extended European Search Report issued Jan. 8, 2013 in Patent Application No. 10816968.1.
International Search Report issued on Aug. 17, 2010 in PCT/JP10/062311 filed on Jul. 22, 2010.
Japanese Office Action issued Mar. 19, 2013 in Patent Application No. 2009-213377.


Also Published As

Publication number | Publication date
JP5568929B2 (en) | 2014-08-13
RU2553061C2 (en) | 2015-06-10
CN102687522A (en) | 2012-09-19
EP2472863A1 (en) | 2012-07-04
EP2472863A4 (en) | 2013-02-06
WO2011033855A1 (en) | 2011-03-24
US20120293405A1 (en) | 2012-11-22
US8952890B2 (en) | 2015-02-10
BR112012005231A2 (en) | 2020-08-04
US20150185830A1 (en) | 2015-07-02
RU2012108872A (en) | 2013-09-20
JP2011066516A (en) | 2011-03-31
KR101784754B1 (en) | 2017-10-16
CN102687522B (en) | 2015-08-19
KR20120082406A (en) | 2012-07-23

Similar Documents

Publication | Title
US9489043B2 (en) | Display device and controlling method
US20230113885A1 (en) | Electronic device for stabilizing image and method for operating same
US8743099B2 (en) | Display adjusting device and display adjusting method
US8913007B2 (en) | Display apparatus and control method
EP3136826B1 (en) | Information processing device, information processing method and program
US20110235807A1 (en) | Audio output device
KR20090063679A (en) | Image display device with pointing function and method
US20110267542A1 (en) | Image processing apparatus and control method thereof
US20250080864A1 (en) | Image processing method and electronic device
CN106063288B (en) | Display device and channel map management method thereof
US20190182434A1 (en) | Method of providing image and electronic device for supporting the method
US20230319393A1 (en) | Exposure parameter adjustment method, device and storage medium
TW201910995A (en) | Methods for adjusting panel brightness and brightness adjustment system
CN104097574A (en) | Image display method and system
US12190771B2 (en) | Display system and display method
JP2009077156A (en) | Image display device, image display method, and remote controller
US10687035B2 (en) | Projection method and projection system
US11741898B1 (en) | Power management for global mode display panel illumination
US10636384B2 (en) | Image processing apparatus and image processing method
US12020655B2 (en) | Method for controlling display apparatus, display apparatus, device, and computer storage medium
JP2010066605A (en) | Image display device and image quality adjustment method
CN119027632A (en) | Electronic device with supplementary transmitter for tracking external objects
JP2011002727A (en) | Image display device
TW201717179A (en) | Curved display device and method for compensating the brightness of images

Legal Events

Date | Code | Title | Description

STCF | Information on status: patent grant

Free format text: PATENTED CASE

AS | Assignment

Owner name: SATURN LICENSING LLC, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SONY CORPORATION;REEL/FRAME:043177/0794

Effective date: 20170613

MAFP | Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

MAFP | Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

