CN113946233A - Display control device and display control method - Google Patents

Display control device and display control method

Info

Publication number
CN113946233A
CN113946233A
Authority
CN
China
Prior art keywords
user
display control
vehicle
state
detection unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010691620.XA
Other languages
Chinese (zh)
Inventor
吴桐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alps Alpine Co Ltd
Original Assignee
Alps Electric Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alps Electric Co Ltd
Priority to CN202010691620.XA
Publication of CN113946233A
Status: Pending

Abstract

Translated from Chinese

An object is to provide a "display control device and display control method" that reduce the inconvenience a user feels when selecting a selection item. When the display control unit (17) of the display control device (3) displays selection items on the touch panels (4L, 4R) and the state detection unit (16) detects that the user is in a predetermined state, a selection item presumed, in view of that state, to correspond to the user's desire is displayed at a position where the user can easily select it. This establishes a state in which a selection item the user is highly likely to select is displayed at an easily selectable position.

Figure 202010691620

Description

Display control device and display control method
Technical Field
The present invention relates to a display control device and a display control method, and more particularly to a display control device and a display control method for displaying selection items selectable by a user on a display unit.
Background
Conventionally, the following systems have been provided: a plurality of selection items are displayed in a list in a predetermined order on a display unit, and if a user selects a selection item according to his or her own desire, a screen corresponding to the desire is newly displayed or a process corresponding to the desire is executed. For example, the following systems are provided: in a vehicle interior space formed in a vehicle interior, a touch panel is provided at a seat other than a driver seat such as a front passenger seat or a rear seat, and a rider can control a control target mounted in the vehicle such as an air conditioner or an audio device by operating the touch panel. In this system, normally, selection items for each control object are displayed in a list in a predetermined order on a home screen of a touch panel, and if a user selects a certain selection item according to his or her own desire, a detailed screen of the control object corresponding to the selected selection item is displayed.
Further, patent document 1 describes an in-vehicle information display device in which the display amount of information shown on the display unit is reduced as the vehicle travels faster, and information considered important to the user, such as information containing keywords, is displayed preferentially as the display amount decreases. Patent document 2 describes a device in which the operation restriction on a vehicle device is released during traveling only when an LED visible only to a fellow passenger is lit in a predetermined lighting pattern and that pattern matches the operation state of an operation switch.
Documents of the prior art
Patent document
Patent document 1: japanese patent laid-open publication No. 2006-
Patent document 2: japanese patent laid-open publication No. 2013-107576
Disclosure of Invention
Problems to be solved by the invention
However, in a system in which a plurality of selection items are displayed in a list in a predetermined order on a display unit and the user selects one according to his or her desire, as in the systems described above, the selection item corresponding to the user's desire may be located where it is difficult to select, and the user may find the selection troublesome.
The present invention has been made to solve this problem, and an object of the present invention is to keep the user from finding it troublesome to select a selection item.
Means for solving the problems
In order to solve the above problem, in the present invention, when a user is in a predetermined state while a selection item is displayed on a display unit, the selection item estimated to correspond to the user's desire in view of the user's predetermined state is displayed at a position that is easily selectable by the user.
Effects of the invention
According to the present invention configured as described above, the plurality of selection items are not always displayed in a list in a fixed order on the display unit; when the user is in the predetermined state, the selection item estimated, in view of that state, to correspond to the user's desire is displayed at an easily selectable position. This establishes a state in which a selection item the user is highly likely to select is displayed where it is easy to select, and keeps the user from finding the selection troublesome.
Drawings
Fig. 1 is a diagram showing a configuration example of a control system according to embodiment 1 of the present invention.
Fig. 2 is a block diagram showing an example of a functional configuration of the display control device according to embodiment 1 of the present invention.
Fig. 3 is a diagram showing an example of the initial home screen image.
Fig. 4 is a diagram showing an example of the adjustment home screen.
Fig. 5 is a flowchart showing an operation example of the display control device according to embodiment 1 of the present invention.
Fig. 6 is a block diagram showing an example of a functional configuration of the display control device according to embodiment 2 of the present invention.
Fig. 7 is a flowchart showing an example of the operation of the display control device according to embodiment 2 of the present invention.
Fig. 8 is a block diagram showing an example of a functional configuration of the display control device according to embodiment 3 of the present invention.
Fig. 9 is a flowchart showing an example of the operation of the display control device according to embodiment 3 of the present invention.
Description of the reference symbols
3, 3A, 3B display control device
4L left touch panel (display unit)
4R right touch panel (display unit)
5L left in-vehicle photographing camera (camera, in-vehicle photographing camera)
5R right in-vehicle photographing camera (camera, in-vehicle photographing camera)
6L left microphone (microphone)
6R right microphone (microphone)
8 vehicle interior space (predetermined space)
13 air conditioner
16, 16A, 16B state detection unit
17, 17A, 17B display control unit
25L left vehicle exterior photographing camera (vehicle exterior photographing camera)
25R right vehicle exterior photographing camera (vehicle exterior photographing camera)
Detailed Description
<Embodiment 1>
Hereinafter, embodiment 1 of the present invention will be described with reference to the drawings. Fig. 1 is a diagram showing a configuration example of a control system 1 according to the present embodiment together with the interior of a vehicle 2 in which the control system 1 is mounted. As shown in fig. 1, the control system 1 is mounted on the vehicle 2 and includes a display control device 3, a left touch panel 4L, a right touch panel 4R, a left in-vehicle photographing camera 5L, a right in-vehicle photographing camera 5R, a left microphone 6L, and a right microphone 6R. The vehicle 2 illustrated in fig. 1 is a four-passenger automobile, and a left rear seat 11 and a right rear seat 12 are provided in addition to a driver seat 9 and a front passenger seat 10 in a vehicle interior space 8 formed in the interior of the vehicle 2.
The left touch panel 4L is a device including a display panel such as a liquid crystal display panel or an organic EL panel and a touch panel superposed on the display panel, and has a function of displaying images and a function of detecting a user's touch operation. The left touch panel 4L is basically used by a passenger seated in the left rear seat 11 (hereinafter referred to as a "left passenger"), and is disposed on the back surface of the front passenger seat 10 where the display area is visible to the left passenger and where the left passenger can easily perform a touch operation on the touch operation area. The right touch panel 4R is a similar device with the same functions as the left touch panel 4L, and is basically used by a passenger seated in the right rear seat 12 (hereinafter referred to as a "right passenger"). The right touch panel 4R is provided on the back surface of the driver seat 9. The left touch panel 4L and the right touch panel 4R each correspond to a "display unit" in the claims. Hereinafter, they are collectively referred to as "touch panels 4L and 4R".
The left in-vehicle photographing camera 5L is a photographing device that performs photographing at a predetermined cycle and outputs captured image data based on the photographing result to the display control device 3. The left in-vehicle photographing camera 5L is provided at a position where the upper half of a passenger, including the face, can be photographed when the passenger is seated in the left rear seat 11. The right in-vehicle photographing camera 5R is a similar device with the same function, and is provided at a position where the upper half of a passenger, including the face, can be photographed when the passenger is seated in the right rear seat 12. Hereinafter, the two are collectively referred to as "in-vehicle photographing cameras 5L and 5R".
The left microphone 6L is a sound pickup device. Taking its directivity and other characteristics into account, it is disposed at a position where it picks up the speech of the passenger seated in the left rear seat 11. The right microphone 6R is a similar device with the same function, disposed at a position where it picks up the speech of the passenger seated in the right rear seat 12. Hereinafter, the two are collectively referred to as "microphones 6L and 6R".
The vehicle 2 is provided with an air conditioner 13. The air conditioner 13 is connected to a control unit 14, and conditions the air in the vehicle interior space 8 under the control of the control unit 14. In fig. 1, the air conditioner 13 and the control unit 14 are simplified and represented by one block for convenience.
As shown in fig. 1, the display control device 3 is connected to each of the touch panels 4L and 4R, the in-vehicle photographing cameras 5L and 5R, the microphones 6L and 6R, and the control unit 14.
Fig. 2 is a block diagram showing an example of the functional configuration of the display control device 3 according to the present embodiment. As shown in fig. 2, the display control device 3 includes a state detection unit 16 and a display control unit 17 as functional components. Each of the functional blocks 16 and 17 may be implemented in hardware, as a DSP (Digital Signal Processor), or in software. For example, when the functional blocks 16 and 17 are implemented in software, they are actually configured by a CPU, RAM, ROM, and the like of a computer, and are realized by the operation of programs stored in a recording medium such as a RAM, a ROM, a hard disk, or a semiconductor memory. The same applies to the other embodiments described later.
Here, the display control device 3 has a function of displaying images on the touch panels 4L and 4R, and a function of receiving position coordinate signals from the touch panels 4L and 4R and executing processing corresponding to a touch operation when a touch panel is touched. The display control device 3 can display a home screen on the touch panels 4L and 4R. The home screen is the most basic of the prepared screens: it is displayed first when the touch panels 4L and 4R are powered on, and the user can call it up at an arbitrary timing. After the touch panels 4L and 4R are powered on, the display control device 3 displays an initial home screen image 19 of the following form as the home screen.
Fig. 3 is a diagram showing an example of the initial home screen 19 together with a rectangular dotted frame indicating the display area 20 of the touch panels 4L and 4R. Fig. 3 is greatly simplified, in keeping with the gist of the embodiment, for ease of explanation. As shown in fig. 3, the initial home screen image 19 is considerably longer in the vertical direction than the display area 20, so only a part of it can be displayed in the display area 20 at a time. In particular, when the initial home screen image 19 is first displayed after power-on, only its uppermost area appears in the display area 20 (the state of fig. 3). The user can bring an arbitrary area of the initial home screen image 19 into the display area 20 by scrolling the screen with a flick or a swipe.
As shown in fig. 3, a plurality of item buttons 21 (corresponding to "selection items" in the claims) are arranged in the vertical direction on the initial home screen 19. An item button 21 is a button that the user can select by a touch operation. The item buttons 21 include an item button 21X for the air conditioner 13 (hereinafter referred to as the "air conditioner item button 21X"). The air conditioner item button 21X is selected when the air conditioner 13 is to be controlled or configured. When the air conditioner item button 21X is selected, a detailed screen is displayed on which more detailed instructions can be given regarding the control and setting of the air conditioner 13. Via the detailed screen, the user can instruct starting or stopping the air conditioner 13, changing its mode, or changing the set temperature or the air direction.
As shown in fig. 3, on the initial home screen 19 the air conditioner item button 21X is disposed at the lowermost position among the plurality of item buttons 21. Therefore, if the initial home screen 19 of fig. 3 were always displayed as the home screen, a user who wants to select the air conditioner item button 21X would need, every time, to scroll the screen by a corresponding amount with a flick or a swipe until the button appears in the display area 20, and then select it. Performing this operation each time the air conditioner item button 21X is selected is troublesome for the user. Therefore, the display control device 3 according to the present embodiment executes the following processing to keep the user from finding the selection of the air conditioner item button 21X troublesome. Hereinafter, assuming that a passenger is seated in the left rear seat 11, the processing that the display control device 3 performs for that passenger (hereinafter, the "left passenger") is described in detail.
While the display control device 3 and the left touch panel 4L are powered on, the state detection unit 16 detects a state in which the left passenger feels hot or cold (hereinafter referred to as the "cold/heat sensation state"). The state detection unit 16 performs the three processes described below (process 1 to process 3) in parallel to detect the cold/heat sensation state of the left passenger.
<Process 1>
The state detection unit 16 receives the captured image data that the left in-vehicle photographing camera 5L outputs at the predetermined cycle. The state detection unit 16 recognizes the face image (i.e., the image of the left passenger's face) in the captured image data by an existing recognition technique, continuously analyzes the face image, and monitors whether the expression shown in the face image changes from one of feeling neither hot nor cold to one of feeling hot or cold. The monitoring is performed with an existing facial expression recognition technique, and may use a model trained by deep learning or another machine learning method. When the expression shown in the face image changes from one of feeling neither hot nor cold to one of feeling hot or cold, the state detection unit 16 detects that the left passenger is in the cold/heat sensation state.
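The patent leaves the expression-recognition model unspecified; as a minimal illustrative sketch (the label names and the idea of a per-frame label stream are assumptions, not part of the patent), the transition described above could be flagged like this:

```python
# Hypothetical labels a per-frame facial-expression classifier might emit.
NEUTRAL = "neutral"  # feeling neither hot nor cold
HOT = "hot"
COLD = "cold"

def detect_thermal_state(expression_stream):
    """Return True when the expression changes from neutral to hot/cold.

    expression_stream: iterable of per-frame expression labels, e.g. produced
    by an expression-recognition model run on camera 5L's captured images.
    """
    previous = None
    for label in expression_stream:
        # The patent's trigger is a *change* from neutral to hot/cold,
        # not merely observing a hot/cold expression.
        if previous == NEUTRAL and label in (HOT, COLD):
            return True
        previous = label
    return False
```

A persistent hot/cold expression with no observed neutral-to-hot/cold transition would not trigger detection under this reading.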
<Process 2>
The state detection unit 16 receives the captured image data that the left in-vehicle photographing camera 5L outputs at the predetermined cycle. The state detection unit 16 recognizes the image of the upper half of the person's body (i.e., of the left passenger) in the captured image data by an existing recognition technique, and continuously analyzes it to monitor whether the person performs an action people perform when they feel hot or cold. The monitoring is performed with an existing motion recognition technique, and may use a model trained by deep learning or another machine learning method. Actions people perform when they feel hot or cold include, for example, wiping off sweat, shivering and hunching up, and putting on (or taking off) a garment. When such an action is performed, the state detection unit 16 detects that the left passenger is in the cold/heat sensation state.
<Process 3>
The state detection unit 16 receives the audio signal from the left microphone 6L, performs analog/digital conversion processing including sampling, quantization, and encoding, performs other signal processing to generate audio data, and buffers the audio data. The buffering may be performed only while the left microphone 6L picks up sound at or above a predetermined sound pressure level. The speech of the left passenger is picked up by the left microphone 6L. The state detection unit 16 performs voice recognition on the buffered audio data and converts it into text. It continuously analyzes the transcribed text, and detects that the left passenger is in the cold/heat sensation state when a keyword registered in advance as a word people say when they feel hot or cold appears. Such keywords are, for example, "hot" and "cold".
In the present embodiment, the state detection unit 16 detects the cold/heat sensation state when a pre-registered keyword appears in the text transcribed from the audio data, but this is not a limitation. The state detection unit 16 may instead analyze the transcribed text with natural language processing along the surrounding context, and detect the cold/heat sensation state when text indicating (or suggesting) that the speaker feels hot or cold appears. In this configuration, the analysis of the transcribed text may use a model trained by deep learning or another machine learning method.
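As a rough sketch of the keyword-matching variant of process 3 (the function name and keyword set are illustrative assumptions; a real system would operate on the output of the speech recognizer):

```python
import re

# Keywords the patent gives as examples; a deployment would register more.
THERMAL_KEYWORDS = {"hot", "cold"}

def is_thermal_utterance(transcript: str) -> bool:
    """True if the transcribed speech contains a pre-registered keyword."""
    # Tokenize case-insensitively so "COLD" and "cold" match alike.
    words = re.findall(r"[a-z']+", transcript.lower())
    return any(w in THERMAL_KEYWORDS for w in words)
```

The context-aware variant described above would replace this exact-match test with an NLP model scoring whole utterances.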
The state detection unit 16 performs the above three processes in parallel, monitors whether the left passenger enters the cold/heat sensation state, and detects that state when it occurs. Upon detecting the cold/heat sensation state of the left passenger, the state detection unit 16 notifies the display control unit 17.
While no notification of the left passenger's cold/heat sensation state has been received from the state detection unit 16, the display control unit 17 displays the initial home screen image 19 (fig. 3) on the left touch panel 4L whenever the home screen is displayed. On the other hand, after receiving such a notification, the display control unit 17 displays the adjustment home screen image 23 (fig. 4) on the left touch panel 4L when the home screen is displayed. If the notification arrives while the initial home screen image 19 is being displayed on the left touch panel 4L, the display control unit 17 replaces the initial home screen image 19 with the adjustment home screen image 23. If the notification arrives while the initial home screen 19 is not displayed, the display control unit 17 displays the adjustment home screen 23 instead of the initial home screen 19 the next time the home screen is displayed.
Fig. 4 is a diagram showing the adjustment home screen 23. As shown in fig. 4, the air conditioner item button 21X is disposed at the uppermost position on the adjustment home screen 23. Here, on the home screen of the present embodiment, the higher an item button 21 is positioned, the easier it is for the left passenger (user) to select it: at the time of selection there is a high probability that no scrolling is needed, and even when scrolling is needed, a small scroll amount is likely to suffice. Accordingly, after receiving the notification of the left passenger's cold/heat sensation state from the state detection unit 16, the display control unit 17 displays the air conditioner item button 21X at a position easily selectable by the user when displaying the home screen. This configuration provides the following effects.
When the state detection unit 16 detects the cold/heat sensation state of the left passenger, the left passenger feels hot or cold, and is therefore in a state (or a highly likely state) of desiring to control the air conditioner 13 to eliminate the heat or cold. In such a situation, the air conditioner item button 21X is the item button 21 estimated, in view of the left passenger's cold/heat sensation state, to correspond to the left passenger's desire to control the air conditioner 13, and is highly likely to be selected in the near future. According to the configuration above, the initial home screen 19 is not always displayed as the home screen; instead, when the user is in the cold/heat sensation state, the adjustment home screen 23, on which the air conditioner item button 21X is displayed at an easily selectable position, is displayed. This reduces the possibility that a selection item highly likely to be selected is displayed where it is hard to select, and keeps the user from finding the selection troublesome.
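The screen-selection behavior described in the preceding paragraphs amounts to a conditional reordering of the item buttons. A minimal sketch (names are illustrative; the patent describes screens, not code):

```python
def home_screen_order(buttons, promoted, state_detected):
    """Return the item-button order for the home screen.

    buttons:        button ids in the initial (default) order
    promoted:       the button presumed to match the user's desire
                    (the air conditioner button in embodiment 1)
    state_detected: True once the state detection unit has reported
                    the predetermined state
    """
    if not state_detected or promoted not in buttons:
        return list(buttons)          # initial home screen 19
    rest = [b for b in buttons if b != promoted]
    return [promoted] + rest          # adjustment home screen 23
```

Moving the promoted button to index 0 corresponds to placing it uppermost, i.e. at the position least likely to require scrolling.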
Briefly describing the operation of the display control unit 17 when the left passenger selects the air conditioner item button 21X on the adjustment home screen 23: upon detecting that the air conditioner item button 21X is selected, the display control unit 17 displays the detailed screen on the left touch panel 4L. The detailed screen is a screen on which the user can give more specific instructions regarding the control and setting of the air conditioner 13. If the user instructs, via the detailed screen, starting the cooling operation at a given set temperature, the display control unit 17 detects the instruction and outputs a corresponding control signal to the control unit 14. The control unit 14 drives and controls the air conditioner 13 based on the control signal.
The above describes the processing that the display control unit 17 executes for the left touch panel 4L based on monitoring of the left passenger. The display control device 3 may likewise monitor the right passenger based on the input from the right in-vehicle photographing camera 5R and the right microphone 6R, and perform the same processing for the right touch panel 4R based on that monitoring result.
Next, an operation example of the display control device 3 is described with reference to a flowchart. The flowchart of fig. 5 shows the flow of the processing the display control device 3 executes for the left passenger and the left touch panel 4L. It is assumed that, at the start of the flowchart, the state detection unit 16 has not detected that the user is in the cold/heat sensation state. As shown in fig. 5, the state detection unit 16 of the display control device 3 monitors whether the user is in the cold/heat sensation state (step SA1). When it detects that the user is in the cold/heat sensation state (YES in step SA1), the display control unit 17 displays, as the home screen, the adjustment home screen 23 on which the air conditioner item button 21X is displayed at an easily selectable position (step SA2). As described above, if the cold/heat sensation state is detected while the initial home screen image 19 is being displayed, the display control unit 17 replaces it with the adjustment home screen image 23; if it is detected while the initial home screen image 19 is not displayed, the adjustment home screen image 23 is displayed the next time the home screen is displayed.
<Modification of Embodiment 1>
Next, a modification of embodiment 1 is described. In embodiment 1 above, the state detection unit 16 detects the user's cold/heat sensation state through processes 1 to 3. However, the detection method is not limited to the one exemplified in embodiment 1. For example, the state detection unit 16 may detect the cold/heat sensation state through any one, or any two, of processes 1 to 3. The state detection unit 16 may also be configured to detect the cold/heat sensation state by additionally taking into account the temperature in the vehicle interior space 8 and the temperature around the vehicle 2, or by using only these temperatures.
In embodiment 1 above, the space in which the display control device 3 is provided is the vehicle interior space 8, and the vehicle interior space 8 corresponds to the "predetermined space" in the claims. However, the space in which the display control device 3 is provided is not limited to the vehicle interior space 8. For example, it may be a room of a house or an office in which an air conditioner is installed.
<Embodiment 2>
Next, embodiment 2 is described. In the description of embodiment 2, the same elements as in embodiment 1 are given the same reference numerals, and their detailed description is omitted. Fig. 6 is a block diagram showing an example of the functional configuration of the display control device 3A according to the present embodiment. As shown in fig. 6, the display control device 3A includes a state detection unit 16A instead of the state detection unit 16, and a display control unit 17A instead of the display control unit 17.
As shown in fig. 6, the display control device 3A is connected to a left vehicle exterior photographing camera 25L and a right vehicle exterior photographing camera 25R. The left vehicle exterior photographing camera 25L is provided on the left side of the vehicle 2 and photographs the left side of the vehicle 2. The right vehicle exterior photographing camera 25R is provided on the right side of the vehicle 2 and photographs the right side of the vehicle 2.
In the present embodiment, the home screen displayed by the display control unit 17A includes a photographing instruction button (corresponding to "an item for the vehicle exterior photographing camera" in the claims) as an item button 21. When the left passenger selects the photographing instruction button displayed on the home screen of the left touch panel 4L, the left vehicle exterior photographing camera 25L photographs a still image and outputs captured image data based on the photographing result. For example, when the left passenger finds the scenery seen through the window glass of the rear left door of the vehicle 2 beautiful and wants to photograph it, the left passenger selects the photographing instruction button and has the left vehicle exterior photographing camera 25L photograph the scenery. Similarly, when the passenger seated in the right rear seat 12 selects the photographing instruction button displayed on the home screen of the right touch panel 4R, the right vehicle exterior photographing camera 25R photographs a still image and outputs captured image data based on the photographing result.
As shown in fig. 6, the display control device 3A is connected to a position detection device 26. The position detection device 26 detects the current position of the vehicle 2 on the map at predetermined intervals based on the detection values of a GPS unit, a relative direction detection unit, and the like, and outputs information indicating the current position of the vehicle 2 to the state detection unit 16A of the display control device 3A.
As shown in fig. 6, the display control device 3A includes a storage unit 27, and good scenery map data 28 is stored in the storage unit 27. The good scenery map data 28 is data in which areas with good scenery on the map (hereinafter referred to as "good scenery areas") are registered. A good scenery area is an area containing objects or natural scenery that many people are thought to want to photograph, such as the surroundings of a landmark, a scenic spot, or a historical object, or stretches along a coast or a river where beautiful natural scenery exists. In the good scenery map data 28, each good scenery area is defined, for example, as a polygonal area formed by connecting a plurality of points on the map.
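If a good scenery area is stored as a polygon of map points as described above, whether the vehicle's current position belongs to the area can be tested with a standard ray-casting point-in-polygon check (a generic sketch, not taken from the patent; coordinates are treated as planar for illustration):

```python
def point_in_polygon(point, polygon):
    """Ray-casting test: is (x, y) inside the polygon?

    polygon: list of (x, y) vertices in order; the last vertex is
    implicitly connected back to the first.
    """
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edges that a horizontal ray from (x, y) crosses.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

Real map coordinates (latitude/longitude) would need projection or a geodesic-aware library, but the membership logic is the same.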
The following describes in detail the processing executed by the display control device 3A for the left occupant and the left touch panel 4L. In the present embodiment, it is assumed that the shooting instruction button is disposed at the lowermost position among the item buttons 21 on the initial home screen displayed by the display control device 3A.
The state detection unit 16A detects a state in which the left passenger wants to photograph the outside of the vehicle with the left vehicle exterior photographing camera 25L (hereinafter referred to as the "photographing desired state"). The state detection unit 16A performs the 4th process and the 5th process described below in parallel to detect the photographing desired state of the left passenger.
<4th process>
Every time information indicating the current position of the vehicle 2 is input from the position detection device 26 at the predetermined cycle, the state detection unit 16A refers to the good scenery map data 28 stored in the storage unit 27 and determines whether or not the current position of the vehicle 2 belongs to a good scenery area. In this manner, the state detection unit 16A keeps track of whether the current position of the vehicle 2 belongs to a good scenery area. While the current position of the vehicle 2 belongs to a good scenery area, the state detection unit 16A executes the following processing. That is, the state detection unit 16A continuously analyzes the captured image data input from the left in-vehicle camera 5L and monitors whether or not the left passenger looks out of the vehicle through the window glass of the left rear door of the vehicle 2 for a certain period of time (for example, 5 seconds) or longer. The monitoring is performed based on an existing motion recognition technique, and may be performed using a model trained by deep learning or another machine learning method.
The state detection unit 16A detects the photographing desired state of the left passenger when the left passenger looks out of the vehicle for the certain time or longer. Here, when the vehicle 2 is located in a good scenery area and the left passenger looks outside for the certain time or longer, the left passenger can be said to be paying attention (or to be likely to be paying attention) to the scenery outside the vehicle, and it can be presumed that the left passenger wants to photograph the outside of the vehicle with the left vehicle exterior photographing camera 25L.
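The combination of the area check and the gaze-duration check described above can be sketched as a small state machine. Everything in this Python sketch is hypothetical (class and parameter names, the way observations are fed in); only the 5-second example threshold comes from the text, and the boolean inputs stand in for the position check and the camera-based motion recognition:

```python
# Hypothetical sketch of the 4th process: the photographing desired state
# is detected when the vehicle is inside a good scenery area AND the
# passenger has looked out of the window continuously for GAZE_SECONDS.

GAZE_SECONDS = 5.0

class GazeStateDetector:
    def __init__(self, threshold=GAZE_SECONDS):
        self.threshold = threshold
        self.gaze_start = None  # time at which the current gaze began

    def update(self, now, in_good_area, looking_outside):
        """Feed one observation; return True when the photographing
        desired state is detected."""
        if not in_good_area or not looking_outside:
            self.gaze_start = None  # gaze interrupted or area left
            return False
        if self.gaze_start is None:
            self.gaze_start = now   # gaze just began
        return (now - self.gaze_start) >= self.threshold

detector = GazeStateDetector()
print(detector.update(0.0, True, True))  # gaze begins -> False
print(detector.update(3.0, True, True))  # 3 s of gaze -> False
print(detector.update(6.0, True, True))  # 6 s of gaze -> True
```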
<5th process>
The state detection unit 16A continuously converts the speech sound of the left passenger picked up by the left microphone 6L into text. This conversion is performed by the method described in the 3rd process of embodiment 1. The state detection unit 16A continuously analyzes the text-converted data and detects the photographing desired state when a keyword registered in advance as a word spoken when a user wants to photograph the outside of the vehicle appears. The keywords are, for example, "I will photograph the outside", "I want to take a picture", "beautiful scenery", or sentences similar to these. The following configuration is also possible: the state detection unit 16A analyzes the text-converted data by natural language processing along the surrounding context and detects the photographing desired state when a word indicating (or suggesting) that the user wants to photograph the outside of the vehicle appears. In this configuration, the analysis of the text converted from the sound data may be performed using a model trained by deep learning or another machine learning method.
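The keyword-based branch of this detection amounts to a substring scan over the transcribed text. The Python sketch below is illustrative only; the keyword list and function name are assumptions, and a real system would more likely use the natural-language or learned-model analysis also described above:

```python
# Hypothetical sketch of the 5th process: scan transcribed speech for
# pre-registered keywords suggesting a wish to photograph the outside.

PHOTO_KEYWORDS = ["photograph the outside", "want to take a picture",
                  "beautiful scenery"]

def detect_photo_desire(transcribed_text, keywords=PHOTO_KEYWORDS):
    """Return True if any registered keyword appears in the text."""
    text = transcribed_text.lower()
    return any(keyword in text for keyword in keywords)

print(detect_photo_desire("Wow, what beautiful scenery!"))  # True
print(detect_photo_desire("Are we there yet?"))             # False
```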
The state detection unit 16A performs the above two processes in parallel to monitor whether or not the left passenger is in the photographing desired state, and detects when the left passenger enters that state. Upon detecting the photographing desired state of the left passenger, the state detection unit 16A notifies the display control unit 17A of this.
When displaying the home screen in a case where a notification of detection of the photographing desired state of the left passenger has not been received from the state detection unit 16A, the display control unit 17A displays the initial home screen. On the other hand, when displaying the home screen after receiving from the state detection unit 16A a notification that the photographing desired state of the left passenger has been detected, the display control unit 17A displays the adjustment home screen instead of the initial home screen. The adjustment home screen according to the present embodiment is a screen on which the shooting instruction button is displayed at the top, that is, a screen on which the shooting instruction button is displayed at a position that is easy to operate.
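The switch between the initial home screen and the adjustment home screen amounts to reordering the item buttons 21. A minimal Python sketch under stated assumptions: the item list below is invented for illustration, and only the shooting instruction button and its bottom-of-list initial position come from the text:

```python
# Hypothetical sketch: build the home screen item order. On the initial
# home screen the shooting instruction button sits at the bottom; once
# the photographing desired state is detected, it is moved to the top.

INITIAL_ITEMS = ["navigation", "audio", "air conditioner",
                 "shooting instruction"]

def build_home_screen(items, desired_state_detected,
                      promoted="shooting instruction"):
    """Return the item order for the home screen to be displayed."""
    if not desired_state_detected or promoted not in items:
        return list(items)                 # initial home screen
    rest = [item for item in items if item != promoted]
    return [promoted] + rest               # adjustment home screen

print(build_home_screen(INITIAL_ITEMS, False))
# ['navigation', 'audio', 'air conditioner', 'shooting instruction']
print(build_home_screen(INITIAL_ITEMS, True))
# ['shooting instruction', 'navigation', 'audio', 'air conditioner']
```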
The configuration of the present embodiment provides the following effects. That is, the state detection unit 16A detects the photographing desired state of the left passenger, that is, a state in which the left passenger wants to photograph the outside of the vehicle with the left vehicle exterior photographing camera 25L. In view of the left passenger being in the photographing desired state, the shooting instruction button is the item button 21 estimated to correspond to the left passenger's desire to photograph the outside of the vehicle, and it can therefore be said that the user is highly likely to select it in the near future. Further, according to the above configuration, the initial home screen is not always displayed as the home screen; when the user is in the photographing desired state, the adjustment home screen, on which the shooting instruction button is displayed at an easily selectable position, is displayed instead. This constructs a state in which a selection item the user is highly likely to select is displayed at an easily selectable position, and suppresses the user from feeling inconvenienced.
Briefly describing the operation of the display control unit 17A when the left passenger selects the shooting instruction button on the adjustment home screen: upon detecting that the shooting instruction button has been selected, the display control unit 17A outputs a corresponding control signal to the left vehicle exterior shooting camera 25L. The left vehicle exterior shooting camera 25L shoots a still image based on the control signal and outputs shot image data based on the shooting result to the display control unit 17A. When the shot image data is input, the display control unit 17A executes corresponding processing; for example, it displays an image based on the shot image data on the left touch panel 4L, or stores the shot image data in a predetermined file format.
Although the processing executed by the display control device 3A for the left occupant and the left touch panel 4L has been described above, the display control device 3A executes the same processing for the right occupant and the right touch panel 4R based on the input from the right in-vehicle shooting camera 5R and the right microphone 6R.
Next, an operation example of the display control device 3A according to the present embodiment will be described with reference to a flowchart. Fig. 7 is a flowchart showing the flow of processing executed by the display control device 3A for the left occupant and the left touch panel 4L. It is assumed that the state detection unit 16A has not detected that the user is in the photographing desired state at the start of the flowchart of fig. 7. As shown in fig. 7, the state detection unit 16A of the display control device 3A monitors whether or not the user is in the photographing desired state (step SB1). When it is detected that the user has entered the photographing desired state (YES in step SB1), the display control unit 17A displays, as the home screen, the adjustment home screen on which the shooting instruction button is displayed at an easily selectable position (step SB2). As described above, when the state detection unit 16A detects the photographing desired state while the initial home screen is being displayed, the display control unit 17A displays the adjustment home screen in place of the initial home screen, and when the state detection unit 16A detects the photographing desired state while the home screen is not being displayed, the display control unit 17A displays the adjustment home screen when the home screen is newly displayed.
< modification of embodiment 2 >
Next, a modification of embodiment 2 will be described. The process by which the state detection unit 16A detects the photographing desired state of the user is not limited to the processes described in embodiment 2. As an example, in the 4th process, the state detection unit 16A may detect that the user is in the photographing desired state when the user (the left passenger in embodiment 2 described above) looks out of the vehicle for a certain time or longer, regardless of whether the position of the vehicle 2 belongs to a good scenery area. In addition, as the condition "the user views the outside of the vehicle in a predetermined form" used for detecting the photographing desired state, the state detection unit 16A may use another condition instead of the condition of embodiment 2 that "the user looks out of the vehicle for a certain time or longer". Another such condition is, for example, "during a 1st period, the cumulative time the user gazes outside the vehicle is equal to or greater than a threshold". The state detection unit 16A may also perform only one of the 4th process and the 5th process.
Note that the display control device 3A according to embodiment 2 may also be configured to simultaneously execute the processing of the display control device 3 according to embodiment 1. That is, the following configuration is possible: the state detection unit 16A detects not only the photographing desired state of the user but also the cooling/heating sensation state, and the display control unit 17A displays the air conditioner item button 21X at an easily selectable position when the user is in the cooling/heating sensation state.
<embodiment 3 >
Next, embodiment 3 will be explained. In the description of embodiment 3, the same elements as those of embodiment 1 are given the same reference numerals, and detailed description thereof is omitted. Fig. 8 is a block diagram showing an example of a functional configuration of the display control device 3B according to the present embodiment. As shown in fig. 8, the display control device 3B includes a state detection unit 16B instead of the state detection unit 16, and includes a display control unit 17B instead of the display control unit 17.
As shown in fig. 8, the display control device 3B includes a content reproduction unit 30 as part of its functional configuration. In the present embodiment, audio files in which music is recorded and video files in which video (such as video relating to a movie) is recorded are stored in a predetermined storage area of the display control device 3B. The content reproduction unit 30 has a function of reproducing the music or video of the files stored in the storage area and outputting the resulting audio signal and video signal to the touch panels 4L and 4R. The music and video reproduced by the content reproduction unit 30 correspond to the "content" of the claims; in particular, both music and video are content accompanied by the output of sound, and correspond to the "content accompanied by the use of the audio output device". Further, the in-vehicle space 8 can be said to be a space capable of providing content to the user. Hereinafter, music and video are collectively referred to as "music and the like", and a user's listening to music or watching video is collectively referred to as "viewing".
In the present embodiment, the home screen displayed by the display control unit 17B includes, as one of the item buttons 21, a content-related button (corresponding to the "item related to content" in the claims) selected when control or setting related to reproduction of music or the like is performed. When the content-related button on the home screen displayed on the left touch panel 4L is selected by the passenger seated in the left rear seat 11, a detailed screen on which more detailed instructions on control and setting related to reproduction of music or the like can be given is displayed on the left touch panel 4L. Via the detailed screen, the left passenger can instruct reproduction of a specific piece of music or a specific video, pause reproduction, or adjust the volume. The same applies to the right touch panel 4R.
The touch panels 4L and 4R are provided with connectors into which headphone plugs are inserted. By wearing headphones, the left passenger or the right passenger can listen to the sound via the headphones during reproduction of music or the like. In the present embodiment, "headphones" is a generic term for an audio output device used by being worn on the ear or the head, and so-called earphones are also included in the concept of headphones.
Hereinafter, the processing executed by the display control device 3B for the left occupant and the left touch panel 4L will be described in detail. In the present embodiment, it is assumed that the content-related button is disposed at the lowermost position among the item buttons 21 on the initial home screen displayed by the display control device 3B.
The state detection unit 16B detects a state in which the left passenger wants to view music or the like (hereinafter referred to as the "provision desired state"). The state detection unit 16B performs the following 6th process and 7th process in parallel to detect the provision desired state of the left passenger.
<6th process>
The state detection unit 16B analyzes captured image data input from the left in-vehicle camera 5L at a predetermined cycle and monitors whether or not the left passenger is wearing headphones. The monitoring is performed based on an existing motion recognition technique, and may be performed using a model trained by deep learning or another machine learning method. The state detection unit 16B detects the provision desired state of the left passenger when the left passenger changes from a state of not wearing headphones to a state of wearing them. This is because the left passenger typically puts on headphones in preparation for reproduction of music or the like, so when the left passenger puts on headphones, it can be presumed that the left passenger wants to view music or the like.
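The not-wearing-to-wearing transition described here is an edge detection on a per-frame boolean. A minimal Python sketch, where the `wearing_now` flag stands in for the camera-based recognition result and all names are assumptions:

```python
# Hypothetical sketch of the 6th process: the provision desired state is
# detected only on the transition from "not wearing headphones" to
# "wearing headphones", judged once per analyzed camera frame.

class HeadphoneWearDetector:
    def __init__(self):
        self.was_wearing = False

    def update(self, wearing_now):
        """Return True only on the not-wearing -> wearing transition."""
        just_put_on = wearing_now and not self.was_wearing
        self.was_wearing = wearing_now
        return just_put_on

detector = HeadphoneWearDetector()
print(detector.update(False))  # not wearing      -> False
print(detector.update(True))   # just put them on -> True
print(detector.update(True))   # still wearing    -> False
```

Triggering only on the transition, rather than on the wearing state itself, avoids re-detecting the desired state on every frame while the headphones stay on.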
<7th process>
The state detection unit 16B continuously converts the speech sound of the left passenger picked up by the left microphone 6L into text. This conversion is performed by the method described in the 3rd process of embodiment 1. The state detection unit 16B continuously analyzes the text-converted data and detects the provision desired state when a keyword registered in advance as a word spoken when a user wants to view music or the like appears. The keywords are, for example, "I want to listen to music", "I want to watch a movie", or sentences similar to these. The following configuration is also possible: the state detection unit 16B analyzes the text-converted data by natural language processing along the surrounding context and detects the provision desired state when a word indicating (or suggesting) that the user wants to view music or the like appears. In this configuration, the analysis of the text converted from the sound data may be performed using a model trained by deep learning or another machine learning method.
The state detection unit 16B performs the above two processes in parallel to monitor whether or not the left passenger is in the provision desired state, and detects when the left passenger enters that state. Upon detecting the provision desired state of the left passenger, the state detection unit 16B notifies the display control unit 17B of this.
When displaying the home screen in a case where a notification of detection of the provision desired state of the left passenger has not been received from the state detection unit 16B, the display control unit 17B displays the initial home screen. On the other hand, when displaying the home screen after receiving from the state detection unit 16B a notification that the provision desired state of the left passenger has been detected, the display control unit 17B displays the adjustment home screen instead of the initial home screen. The adjustment home screen according to the present embodiment is a screen on which the content-related button is displayed at the top, that is, a screen on which the content-related button is displayed at a position that is easy to operate.
The configuration of the present embodiment provides the following effects. That is, the state detection unit 16B detects the provision desired state of the left passenger, that is, a state in which the left passenger wants to view music or the like. In view of the left passenger being in the provision desired state, the content-related button is the item button 21 estimated to correspond to the left passenger's desire to view music or the like, and it can therefore be said that the user is highly likely to select it in the near future. Further, according to the above configuration, the initial home screen is not always displayed as the home screen; when the user is in the provision desired state, the adjustment home screen, on which the content-related button is displayed at an easily selectable position, is displayed instead. This constructs a state in which a selection item the user is highly likely to select is displayed at an easily selectable position, and suppresses the user from feeling inconvenienced.
Briefly describing the operation of the display control unit 17B when the left passenger selects the content-related button on the adjustment home screen: upon detecting that the content-related button has been selected, the display control unit 17B displays the detailed screen on the left touch panel 4L. The detailed screen is a screen on which the user can give more specific instructions for control and setting related to reproduction of music or the like. When the user instructs the start of reproduction of a specific piece of music, the display control unit 17B detects the instruction and outputs a corresponding control signal to the content reproduction unit 30. The content reproduction unit 30 starts reproduction of the specific piece of music based on the control signal.
Although the processing executed by the display control unit 17B for the left occupant and the left touch panel 4L has been described above, the display control unit 17B executes the same processing for the right occupant and the right touch panel 4R based on the input from the right in-vehicle shooting camera 5R and the right microphone 6R.
Next, an operation example of the display control device 3B will be described with reference to a flowchart. Fig. 9 is a flowchart showing the flow of processing executed by the display control device 3B for the left occupant and the left touch panel 4L. It is assumed that the state detection unit 16B has not detected that the user is in the provision desired state at the start of the flowchart of fig. 9. As shown in fig. 9, the state detection unit 16B of the display control device 3B monitors whether or not the user is in the provision desired state (step SC1). When it is detected that the user has entered the provision desired state (YES in step SC1), the display control unit 17B displays, as the home screen, the adjustment home screen on which the content-related button is displayed at an easily selectable position (step SC2). As described above, when the state detection unit 16B detects the provision desired state while the initial home screen is being displayed, the display control unit 17B displays the adjustment home screen in place of the initial home screen, and when the state detection unit 16B detects the provision desired state while the home screen is not being displayed, the display control unit 17B displays the adjustment home screen when the home screen is newly displayed.
< modification ofembodiment 3 >
Next, a modification of embodiment 3 will be described. The process by which the state detection unit 16B detects the provision desired state of the user (the left occupant in embodiment 3 described above) is not limited to the processes exemplified in embodiment 3. For example, the state detection unit 16B may detect the provision desired state of the user by performing only one of the 6th process and the 7th process. In embodiment 3, the space in which the display control device 3B is provided is the vehicle interior space 8, and the vehicle interior space 8 corresponds to the "predetermined space" in the claims. In this regard, the space in which the display control device 3B is provided is not limited to the vehicle interior space 8; for example, it may be a room of a house or an office in which an air conditioner is installed.
In the above-described embodiment, the content reproduction unit 30 reproduces music and the like from audio files or video files stored in the storage area of the display control device 3B, but it may instead reproduce music and the like recorded on various media such as a DVD or a CD, or reproduce music and the like based on various files stored in an external device communicably connected to the display control device 3B. The function of the content reproduction unit 30 may also be provided in an external device communicably connected to the display control device 3B. Further, the content provided is not limited to video or music; for example, it may be television broadcasting, radio broadcasting, or the like.
The display control device 3B according to embodiment 3 may be configured to simultaneously execute the processing of the display control device 3 according to embodiment 1 and the processing of the display control device 3A according to embodiment 2. That is, the following configuration may be adopted: the state detection unit 16B detects not only the provision desired state of the user but also the cooling/heating sensation state and the photographing desired state, and the display control unit 17B displays the air conditioner item button 21X at an easily selectable position when the user is in the cooling/heating sensation state, and displays the shooting instruction button at an easily selectable position when the user is in the photographing desired state.
While the embodiments (including the modifications) of the present invention have been described above, the above embodiments are merely examples for embodying the present invention, and the technical scope of the present invention should not be construed in a limited manner on their basis. That is, the present invention can be implemented in various forms without departing from its gist or main features.
For example, in embodiment 1 described above, the touch panels 4L and 4R are provided at positions where they can be used by the left and right occupants, respectively, but a configuration in which a touch panel is provided at the passenger seat is also possible. The same applies to the other embodiments.
In embodiment 1, the screen displaying the list of selection items (item buttons 21) is the home screen, but the screen displaying the list of selection items may be a screen that does not function as a home screen. In addition, the list of selection items may be displayed on the screen in a form other than arranging the selection items in the vertical direction as in embodiment 1. For example, a plurality of selection items (for example, items indicated by icons) may be displayed in tiles across a plurality of pages that can transition to one another. In this case, the position at which an item button 21 is easily selected by the user is the page displayed first by default among the plurality of pages (hereinafter referred to as the "top page"). This is because, when an item button 21 is displayed on that page, no page-transition operation is needed to select it. Accordingly, in this case, if the state detection unit 16 detects the cooling/heating sensation state of the user, the display control unit 17 displays the air conditioner item button 21X on the top page. The same applies to the other embodiments.
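For the tiled-page variant, placing an item button on the top page can be sketched as promoting it to the front of the item list before pagination. The page size and item names in this Python sketch are illustrative assumptions; only the air conditioner item button comes from the text:

```python
# Hypothetical sketch: selection items are spread over pages of fixed
# size, and a detected state promotes one item onto the top page (the
# page displayed first by default).

PAGE_SIZE = 4

def paginate(items, page_size=PAGE_SIZE):
    """Split the flat item list into pages of at most `page_size` items."""
    return [items[i:i + page_size] for i in range(0, len(items), page_size)]

def promote_to_top_page(items, promoted):
    """Move `promoted` to the front so it lands on the top page."""
    rest = [item for item in items if item != promoted]
    return [promoted] + rest

items = ["nav", "audio", "phone", "settings", "air conditioner", "camera"]
pages = paginate(promote_to_top_page(items, "air conditioner"))
print(pages[0])  # ['air conditioner', 'nav', 'audio', 'phone']
```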
In addition, in embodiment 1 described above, the display control unit 17 places the air conditioner item button 21X at the topmost position on the adjustment home screen 23, but it need not be the topmost position; it may be, for example, the 2nd or 3rd position from the top. However, from the viewpoint of disposing the air conditioner item button 21X at a position easily selectable by the user, it is desirable that the air conditioner item button 21X be displayed at a position visible without scrolling the screen. The same applies to the other embodiments.
In addition, in embodiment 1, all or part of the processing performed by the functional blocks of the display control device 3 may be performed by an external device connected to the display control device 3. The external device is, for example, a cloud server connected to the network N. In this case, the display control device 3 and the external device cooperatively function as the "display control device". The same applies to the other embodiments.

Claims (15)

1. A display control apparatus is characterized in that,
the disclosed device is provided with:
a display control unit that displays selection items selectable by a user on a display unit; and
a state detection unit that detects that a user is in a predetermined state;
the display control unit displays the selection item, which is estimated to correspond to a desire of the user in view of the user being in the predetermined state, at a position that is easily selectable by the user, when the state detection unit detects that the user is in the predetermined state while the selection items are displayed on the display unit.
2. The display control apparatus according to claim 1,
is arranged in a prescribed space where a user stays;
an air conditioning device is provided in the predetermined space;
the selection item includes an item related to the air conditioner;
the state detection unit detects a state in which the user feels hot or cold;
when the state detection unit detects the state, the display control unit displays the selection item related to the air conditioner at a position that is easily selected by a user.
3. The display control apparatus according to claim 2,
a camera capable of photographing a user is provided in the predetermined space;
the state detection unit analyzes the expression of the user based on the result of image capturing by the camera, and detects the state in which the user feels hot or cold.
4. The display control apparatus according to claim 2,
a camera capable of photographing a user is provided in the predetermined space;
the state detection unit analyzes the movement of the user based on the result of image capturing by the camera, and detects the state in which the user feels hot or cold.
5. The display control apparatus according to claim 2,
a microphone for inputting the speech sound of the user is arranged in the specified space;
the state detection unit analyzes the content of the user's speech sound input to the microphone, and detects the state in which the user feels hot or cold.
6. The display control apparatus according to claim 2,
the predetermined space is an in-vehicle space formed in the cabin of a vehicle.
7. The display control apparatus according to claim 1,
is provided in an in-vehicle space formed in the cabin of a vehicle;
the vehicle is provided with an outside-vehicle photographing camera for photographing the outside of the vehicle;
the selection item includes an item related to the exterior photographing camera;
the state detection unit detects a state in which the user wants to photograph the outside of the vehicle with the vehicle exterior photographing camera;
when the state detection unit detects the state, the display control unit displays the selection item related to the exterior photographing camera at a position that is easily selected by a user.
8. The display control apparatus according to claim 7,
an in-vehicle photographing camera capable of photographing a user is provided in the in-vehicle space;
the state detection unit monitors, based on the imaging result of the in-vehicle imaging camera, whether or not the user views the outside of the vehicle in a predetermined form, and detects, when the user views the outside of the vehicle in the predetermined form, the state in which the user wants to photograph the outside of the vehicle with the vehicle exterior photographing camera.
9. The display control apparatus according to claim 8,
the state detection unit detects whether or not the scenery around the vehicle is good, and detects, when the scenery around the vehicle is good and the user views the outside of the vehicle in the predetermined form, the state in which the user wants to photograph the outside of the vehicle with the vehicle exterior photographing camera.
10. The display control apparatus according to claim 7,
a microphone for inputting the speaking voice of the user is arranged in the space in the vehicle;
the state detection unit analyzes the content of the user's speech sound input to the microphone, and detects the state in which the user wants to photograph the outside of the vehicle with the vehicle exterior photographing camera.
11. The display control apparatus according to claim 1,
is provided in a prescribed space capable of providing a prescribed content to a user;
the selection item includes an item related to the predetermined content;
the state detection unit detects a state in which the user wants to receive the provision of the predetermined content;
when the state detection unit detects the state, the display control unit displays the selection item related to the predetermined content at a position that is easily selected by a user.
12. The display control apparatus according to claim 11,
the predetermined content is content accompanied by the use of an audio output device that is used by being worn on the ear or the head;
a camera capable of photographing a user is provided in the predetermined space;
the state detection unit monitors, based on the result of photographing by the camera, whether or not the user wears the audio output device, and detects, when the user wears the audio output device, the state in which the user wants to receive the provision of the content accompanied by the use of the audio output device;
the display control unit displays the selection item related to the content associated with the use of the audio output device at a position that is easily selectable by the user when the state detection unit detects the state.
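The flow of claim 12 — detect from the camera that the user has put on an ear- or head-worn audio output device, then place the audio-related selection items where they are easiest to select — could be sketched as follows. The `SelectionItem` structure and `arrange_items` helper are assumptions made for illustration, not structures from the patent.

```python
# Illustrative sketch of claim 12: when the user is detected wearing an
# audio output device, move selection items tied to audio content to the
# front (easiest-to-select) positions. Data structures are assumed.

from dataclasses import dataclass

@dataclass
class SelectionItem:
    label: str
    uses_audio_device: bool

def arrange_items(items, user_wearing_audio_device: bool):
    """Return items ordered so that, when the user is wearing the audio
    output device, audio-related items come first."""
    if not user_wearing_audio_device:
        return list(items)
    # False sorts before True, so audio items (key False) lead the list;
    # Python's sort is stable, preserving relative order within groups.
    return sorted(items, key=lambda it: not it.uses_audio_device)

menu = [
    SelectionItem("Navigation", False),
    SelectionItem("Music", True),
    SelectionItem("Climate", False),
    SelectionItem("Podcasts", True),
]
ordered = arrange_items(menu, user_wearing_audio_device=True)
print([it.label for it in ordered])  # ['Music', 'Podcasts', 'Navigation', 'Climate']
```

Because the sort is stable, the menu keeps its familiar relative order within each group, which matters for a touch interface the user has already learned.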
13. The display control apparatus according to claim 11,
a microphone for inputting the user's speech is provided in the predetermined space;
the state detection unit analyzes the content of the speech input to the microphone, and thereby detects that the user is in a state of wanting to receive the provision of the predetermined content.
14. The display control apparatus according to claim 11,
the predetermined space is an in-vehicle space formed in a cabin of a vehicle.
15. A display control method, characterized by comprising the following steps:
a step in which a state detection unit of the display control device detects that a user is in a predetermined state;
and a step in which, when the state detection unit detects that the user is in the predetermined state while the selection item is displayed on the display unit, the display control unit of the display control device displays the selection item that is estimated, in view of the user being in the predetermined state, to correspond to the user's desire at a position that is easily selectable by the user.
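The two steps of the claimed method — state detection, then display control — can be sketched end to end. The state names, menu items, and state-to-item mapping below are invented for illustration; the claims deliberately do not fix any concrete values.

```python
# End-to-end sketch of the claimed display control method: given a
# detected user state, promote the selection item estimated to match the
# user's desire to the easiest-to-select (first) position. The mapping
# and item names are illustrative assumptions.

ESTIMATED_ITEM_FOR_STATE = {
    "wants_exterior_photo": "Camera",
    "wants_audio_content": "Music",
}

def display_order(items, detected_state):
    """Put the item estimated to match the detected user state first,
    leaving the rest of the menu in its original order."""
    preferred = ESTIMATED_ITEM_FOR_STATE.get(detected_state)
    if preferred is None or preferred not in items:
        return list(items)
    return [preferred] + [it for it in items if it != preferred]

menu = ["Navigation", "Music", "Camera", "Climate"]
print(display_order(menu, "wants_exterior_photo"))  # ['Camera', 'Navigation', 'Music', 'Climate']
```

When no state is detected, or the estimated item is not on the current menu, the original order is kept, so the promotion never hides an item the user expected to find.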
CN202010691620.XA · 2020-07-17 (priority) · 2020-07-17 (filed) · Display control device and display control method · Pending · CN113946233A (en)

Priority Applications (1)

Application Number · Priority Date · Filing Date · Title
CN202010691620.XA · 2020-07-17 · 2020-07-17 · Display control device and display control method

Publications (1)

Publication Number · Publication Date
CN113946233A (en) · 2022-01-18

Family

ID=79327048

Family Applications (1)

Application Number · Title · Priority Date · Filing Date
CN202010691620.XA (Pending as CN113946233A (en)) · Display control device and display control method · 2020-07-17 · 2020-07-17

Country Status (1)

Country · Link
CN · CN113946233A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number · Priority date · Publication date · Assignee · Title
CN105966329A (en)* · 2015-03-12 · 2016-09-28 · MAN Truck & Bus AG · Device and method for supporting a driver of a vehicle, in particular a commercial vehicle
CN106716407A (en)* · 2014-09-15 · 2017-05-24 · Denso Corporation · Vehicle equipment control apparatus and method for searching control content
CN107526500A (en)* · 2016-06-22 · 2017-12-29 · Banma Network Technology Co., Ltd. · Method for adjusting functions, device, equipment, interface system and control device
US20180217717A1 (en)* · 2017-01-31 · 2018-08-02 · Toyota Research Institute, Inc. · Predictive vehicular human-machine interface
CN109109782A (en)* · 2017-06-26 · 2019-01-01 · LG Electronics Inc. · Interface system for vehicle
CN111277755A (en)* · 2020-02-12 · 2020-06-12 · Guangzhou Xiaopeng Motors Technology Co., Ltd. · Photographing control method and system and vehicle


Similar Documents

Publication · Title

US12182387B1 (en) · Automatically adjusting media display in a personal display system
US20250068383A1 (en) · Devices with enhanced audio
CN108028957B (en) · Information processing apparatus, information processing method, and machine-readable medium
KR20210011416A (en) · Shared environment for vehicle occupants and remote users
CN113614686B (en) · System and method for viewing occupant status and managing equipment of a building
CN114339335A (en) · A vehicle-mounted multimedia theater system and its control method
JP2005250322A (en) · Display device
CN110996163B (en) · System and method for automatic subtitle display
JP2017219746A (en) · Voice output controller and voice output control program
US20250110631A1 (en) · Techniques for changing display of controls
US20250110625A1 (en) · Techniques for displaying different controls
JP2024007600A (en) · In-vehicle device and sound image localization position adjustment method
JP2020188390A (en) · Entertainment control device and entertainment control method
CN113946233A (en) · Display control device and display control method
JP2019211529A (en) · Voice memory device
CN115315374B (en) · Sound data processing device and sound data processing method
WO2024001091A1 (en) · Method and apparatus for controlling vehicle assembly, and electronic device and readable storage medium
JP2010221893A (en) · In-vehicle information equipment
US20250110630A1 (en) · User interfaces and techniques for displaying information
US20250296439A1 (en) · Cabin environment control apparatus
US20250296401A1 (en) · Cabin environment control apparatus
JP2025145175A · Vehicle cabin environment control device
CN117885668A (en) · Anti-carsickness control method and device, vehicle, medium and equipment
JP2021144262A (en) · Voice output control device, mobile body including the same, and voice output control method
KR20150089466A (en) · Display device and method for controlling the same

Legal Events

Code · Title
PB01 · Publication
SE01 · Entry into force of request for substantive examination
