Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without making any inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an embodiment of the present invention, there is provided a method embodiment of a control method of an in-vehicle atmosphere lamp. It should be noted that the steps shown in the flowchart of the drawings may be performed in a computer system, such as one executing a set of computer-executable instructions, and that although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from that described herein.
The method embodiments may be performed in an electronic device or similar computing device that includes a memory and a processor. Taking operation on a vehicle terminal as an example, the vehicle terminal may include one or more processors (which may include, but are not limited to, a central processing unit (Central Processing Unit, CPU), a graphics processor (Graphics Processing Unit, GPU), a digital signal processing (Digital Signal Processing, DSP) chip, a microcontroller (Micro Controller Unit, MCU), a programmable logic device (Field Programmable Gate Array, FPGA), a neural network processor (Neural-network Processing Unit, NPU), a tensor processor (Tensor Processing Unit, TPU), an artificial intelligence (Artificial Intelligence, AI) processor, etc.) and a memory for storing data. Optionally, the vehicle terminal may further include a transmission device for a communication function, an input-output device, and a display device. It will be appreciated by those skilled in the art that the above description of the structure is merely illustrative and is not intended to limit the structure of the vehicle terminal. For example, the vehicle terminal may also include more or fewer components than described above, or have a configuration different from that described above.
The memory may be used to store a computer program, for example, a software program and modules of application software, such as a computer program corresponding to the control method of the vehicle-mounted atmosphere lamp in the embodiment of the present invention; the processor executes various functional applications and data processing by running the computer program stored in the memory, that is, implements the control method of the vehicle-mounted atmosphere lamp. The memory may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory may further include memory remotely located with respect to the processor, and the remote memory may be connected to the vehicle terminal through a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device is used for receiving or transmitting data via a network. Specific examples of the above network may include a wireless network provided by a communication provider of the vehicle terminal. In one example, the transmission device comprises a network adapter (Network Interface Controller, NIC) that can be connected to other network devices via a base station so as to communicate with the internet. In one example, the transmission device may be a radio frequency (Radio Frequency, RF) module, which is used to communicate with the internet wirelessly.
The display device may be, for example, a touch-screen liquid crystal display (Liquid Crystal Display, LCD) or a touch display (also referred to as a "touch screen" or "touch display screen"). The liquid crystal display may enable a user to interact with a user interface of the vehicle terminal. In some embodiments, the vehicle terminal has a graphical user interface (Graphical User Interface, GUI), and the user can interact with the GUI through finger contacts and/or gestures on the touch-sensitive surface. The human-machine interaction functionality optionally includes interactions such as creating web pages, drawing, word processing, making electronic documents, games, video conferencing, instant messaging, sending and receiving electronic mail, call interfaces, playing digital video, playing digital music, and/or web browsing, and executable instructions for performing the human-machine interaction functionality described above are configured/stored in one or more processor-executable computer program products or readable storage media.
Fig. 1 is a flowchart of a control method of an in-vehicle atmosphere lamp according to one embodiment of the present invention, as shown in fig. 1, the method includes the steps of:
step S10, a current scene mode of a vehicle-mounted central control screen is obtained, wherein the current scene mode comprises one of an entertainment mode, a rest mode and an interaction mode;
In step S10, the entertainment mode is used to characterize that the vehicle-mounted central control screen is currently playing video resources such as music album art or a movie.
The rest mode is used to characterize that the vehicle-mounted central control screen is displaying relaxing dynamic scenes such as forests, oceans, and greenbelts.
The interaction mode is used to characterize that the driver and passengers interact with the vehicle-mounted atmosphere lamp through the central control screen.
Step S12, analyzing the current scene mode by using a preset color extraction model, and determining initial tone parameters of the vehicle-mounted atmosphere lamp;
In step S12, the preset color extraction model is used to characterize a color extraction model based on a deep learning algorithm.
The initial hue parameters described above are used to characterize a set of parameters describing color characteristics, including but not limited to the following. Hue (Hue) describes the basic type of a color, that is, the position of the color on the color wheel, typically expressed as an angle or a numerical value, such as red, green, or blue. Saturation (Saturation) describes the purity or intensity of a color; high saturation means the color is vivid, and low saturation means the color is closer to gray. Brightness (Brightness/Value) describes how light or dark a color is; a color with high brightness appears brighter and a color with low brightness appears darker. Shade (Shade) is obtained by adding black to a color to change its brightness and saturation; for example, a shade of red may be dark red. Color temperature (Color Temperature) is a parameter describing the color of light, usually expressed in Kelvin (K); a color with a higher color temperature is bluish and a color with a lower color temperature is reddish.
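For illustration only, the following is a minimal sketch of how such a set of hue parameters might be represented in code; the field names and default values are assumptions rather than values prescribed by this embodiment.

```python
from dataclasses import dataclass

@dataclass
class HueParameters:
    """Illustrative container for the color characteristics described above."""
    hue: float                          # position on the color wheel, in degrees [0, 360)
    saturation: float                   # purity/intensity of the color, in [0, 1]
    brightness: float                   # lightness/value of the color, in [0, 1]
    shade: float = 0.0                  # amount of black mixed into the color, in [0, 1]
    color_temperature: float = 6500.0   # light color in Kelvin; higher is bluer, lower is redder

# Example: a fairly vivid warm yellow, matching the values used later in this description
initial = HueParameters(hue=30.0, saturation=0.8, brightness=0.6)
```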
Specifically, the preset color extraction model extracts color information from the content displayed on the vehicle-mounted central control screen of the current scene mode, and an initial tone parameter suggestion is given.
Specifically, first, a large-scale image dataset is required, in which the images contain rich color information. Each image in the dataset should have a corresponding label that identifies the dominant color features in the image. The images are preprocessed, for example, scaled and normalized. The preset color extraction model may then be designed using a neural network such as a convolutional neural network (Convolutional Neural Network, CNN). CNNs include, but are not limited to, network structures such as VGG and ResNet. The input layer of the network is designed to accept the preprocessed image data. Convolution layers, pooling layers, and fully connected layers are added to extract image features. One or more output layers are added at the end of the network for predicting the hue parameters. The labeled dataset is then used to train the model so that it learns how to extract the hue parameters from an image. The dataset is divided into a training set, a validation set, and a test set. The predictions of the model are optimized using a cross-entropy loss function, and the weights of the model are updated using an optimization algorithm such as gradient descent. The validation set is used periodically to check the performance of the model and make adjustments. The performance of the model is then evaluated on the test set to ensure that it can accurately extract the hue parameters: indexes such as the accuracy and recall of the model are calculated, and the model parameters or structure are adjusted as needed. Finally, when the accuracy of the preset color extraction model reaches a preset threshold, the model can be used to extract the hue parameters of new images.
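As a sketch of how such a convolutional network could be set up, the following PyTorch snippet defines a small CNN with convolution, pooling, and fully connected layers and runs one training step. The layer sizes, the normalization of labels to [0, 1], and the use of a mean-squared-error regression loss (in place of the cross-entropy loss mentioned above) are simplifying assumptions for illustration only.

```python
import torch
import torch.nn as nn

class ColorExtractionCNN(nn.Module):
    """Illustrative CNN mapping a 224x224 RGB image to (hue, saturation, brightness)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, 3),   # hue, saturation, brightness
            nn.Sigmoid(),       # outputs normalized to [0, 1]
        )

    def forward(self, x):
        return self.head(self.features(x))

model = ColorExtractionCNN()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)   # gradient descent, as described above
loss_fn = nn.MSELoss()                                     # regression loss, used here for illustration

# One illustrative training step on a dummy batch (labels normalized to [0, 1])
images = torch.rand(8, 3, 224, 224)
labels = torch.rand(8, 3)
optimizer.zero_grad()
loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()
```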
Further, the image information input into the preset color extraction model is determined according to the current scene mode. The preset color extraction model outputs the predicted hue parameters, and post-processing is performed on the output, such as converting the hue angle into a numerical value, to obtain the initial hue parameters. For example, for image information input into the preset color extraction model, the preset color extraction model outputs a prediction result of a hue of 30 degrees, a saturation of 0.8, and a brightness of 0.6.
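A minimal sketch of such post-processing is shown below, assuming the model emits values normalized to [0, 1] as in the training sketch above; the rounding and clamping choices are illustrative assumptions.

```python
def postprocess(raw_output):
    """Convert a normalized (hue, saturation, brightness) prediction into initial hue parameters."""
    h, s, v = raw_output
    return {
        "hue": round(h * 360.0, 1),                      # map [0, 1] back to degrees on the color wheel
        "saturation": round(min(max(s, 0.0), 1.0), 2),   # clamp to the valid range
        "brightness": round(min(max(v, 0.0), 1.0), 2),
    }

# e.g. a raw prediction of (0.083, 0.8, 0.6) becomes roughly a 30-degree hue, 0.8 saturation, 0.6 brightness
print(postprocess((0.083, 0.8, 0.6)))
```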
Step S14, adjusting the initial tone parameters based on a preset color strategy and the historical tone parameters to obtain target tone parameters;
In step S14, the preset color strategy is used to characterize a color psychology application strategy that adjusts colors according to the scene to create a specific atmosphere. The initial hue parameters are adjusted based on the different effects that colors have on a person's mood.
The historical tone parameters are used to characterize the user's historical adjustments to the vehicle-mounted atmosphere lamp, including, but not limited to, information such as color brightness and color preference. The historical hue parameters are analyzed to find the user's preferences in different situations.
Specifically, according to the color psychology application strategy and the user's historical adjustments to the vehicle-mounted atmosphere lamp, the initial tone parameters extracted based on the preset color extraction model are adjusted to obtain the target tone parameters. For example, based on the preset color extraction model, it is determined that the initial hue parameters of the vehicle-mounted atmosphere lamp include a hue of 30 degrees (yellow series), a saturation of 0.8, and a brightness of 0.6; according to the user's preference (the user prefers soft light) and the color psychology strategy (yellowish light helps relaxation), the initial hue parameters are adjusted to a softer yellow with a hue of 30 degrees, a saturation of 0.7, and a brightness of 0.5.
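One possible realization of this adjustment is sketched below; the offset values, the blending weight, and the dictionary layout are assumptions chosen so that the example reproduces the values above (hue 30, saturation 0.7, brightness 0.5), not a prescribed implementation.

```python
def adjust_hue_parameters(initial, strategy, history):
    """Blend the initial hue parameters with a color-psychology strategy and the user's history.

    `strategy` and `history` are illustrative dictionaries; the offsets and the blending
    weight below are assumptions, not values prescribed by this embodiment.
    """
    target = dict(initial)
    # Apply the strategy offsets first (e.g. "soften a warm hue to aid relaxation").
    target["saturation"] = max(0.0, min(1.0, target["saturation"] + strategy.get("saturation_offset", 0.0)))
    target["brightness"] = max(0.0, min(1.0, target["brightness"] + strategy.get("brightness_offset", 0.0)))
    # Then pull the result toward the user's historical preference for similar scenes.
    if "preferred_brightness" in history:
        weight = 0.5  # assumed blending weight between strategy result and historical preference
        target["brightness"] = (1 - weight) * target["brightness"] + weight * history["preferred_brightness"]
    return {key: round(value, 2) for key, value in target.items()}

initial = {"hue": 30.0, "saturation": 0.8, "brightness": 0.6}
strategy = {"saturation_offset": -0.1, "brightness_offset": -0.1}   # "softer yellowish"
history = {"preferred_brightness": 0.5}                             # the user prefers soft light
print(adjust_hue_parameters(initial, strategy, history))            # hue 30.0, saturation 0.7, brightness 0.5
```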
Step S16, controlling the vehicle-mounted atmosphere lamp to output lamplight with a corresponding tone by utilizing the target tone parameter.
Specifically, the target tone parameter is used to control a specific color parameter of the atmosphere lamp. And outputting corresponding light colors and brightness by the vehicle-mounted atmosphere lamp according to the target tone parameters.
Based on the above steps S10 to S16, in the embodiment of the present invention, the current scene mode of the vehicle-mounted central control screen is acquired, the current scene mode is analyzed by using a preset color extraction model, initial tone parameters of the vehicle-mounted atmosphere lamp are determined, the initial tone parameters are adjusted based on a preset color strategy and historical tone parameters to obtain target tone parameters, and finally the vehicle-mounted atmosphere lamp is controlled to output light of a corresponding tone by using the target tone parameters. This achieves the purpose of controlling real-time interaction between the vehicle-mounted atmosphere lamp and the central control screen, thereby realizing the technical effects of improving the flexibility of the vehicle-mounted atmosphere lamp and the driving experience, and further solving the technical problem in the related art that the driving experience is poor because the vehicle-mounted atmosphere lamp cannot interact with the central control screen in real time.
Optionally, in step S12, analyzing the current scene mode with the preset color extraction model, and determining the initial tone parameter of the vehicle-mounted atmosphere lamp includes:
Step S1211, in response to the current scene mode being an entertainment mode or a rest mode, capturing image information and audio information of a picture currently displayed by the vehicle-mounted central control screen according to a preset time interval;
Specifically, according to the current scene mode selected by the user on the vehicle-mounted central control screen, the picture image information is analyzed by using the preset color extraction model so as to determine the initial tone parameters. If the current scene mode is the entertainment mode or the rest mode, the picture image currently displayed on the vehicle-mounted central control screen is captured at a preset time interval, and the audio information is extracted. For example, the currently displayed picture image is captured at intervals such as every 5 seconds or every 10 seconds, and the audio information is extracted.
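A minimal sketch of such interval-based capture is shown below; `capture_screen_frame` and `capture_audio_clip` stand in for the vehicle platform's own screen and audio interfaces and are hypothetical names, as is the 5-second interval.

```python
import time

CAPTURE_INTERVAL_SECONDS = 5  # e.g. capture every 5 seconds, as in the example above

def capture_loop(capture_screen_frame, capture_audio_clip, handle_sample, keep_running):
    """Periodically capture the currently displayed picture and an accompanying audio clip.

    The callables are placeholders for the vehicle platform's own interfaces;
    `handle_sample` would forward each frame/audio pair to the color extraction model.
    """
    while keep_running():
        frame = capture_screen_frame()   # picture currently displayed on the central control screen
        audio = capture_audio_clip()     # audio currently being played
        handle_sample(frame, audio)
        time.sleep(CAPTURE_INTERVAL_SECONDS)
```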
In step S1212, the picture image information is analyzed by using the preset color extraction model to determine the hue parameter, and the audio information is analyzed by using the preset color extraction model to determine the brightness parameter.
Specifically, for the audio information, audio features in the audio information are converted into brightness parameters using a deep learning network in the preset color extraction model. Deep learning networks for processing audio information include, but are not limited to, a CNN, a generative adversarial network (Generative Adversarial Network, GAN), deep embedded clustering (Deep Embedding Clustering), and the like. For example, a CNN may be trained to identify specific patterns and features in the audio and convert the audio features into color or brightness parameters. In the audio-to-color conversion, a GAN may be used to generate color or brightness parameters that match the audio features, where the generator learns a mapping from audio to colors and the discriminator ensures that the generated color parameters are similar to actual color parameters. Deep embedded clustering may be used to cluster the audio data into different categories, each of which may be associated with a specific color or brightness parameter.
Further, the preset color extraction model analyzes the captured picture image and audio information. For the picture image information, the preset color extraction model analyzes the main colors in the image and determines the hue parameters of the atmosphere lamp based on these colors. For example, if the main colors in the image are blue and green, the hue parameters are numerical values corresponding to the blue and green color series. For the audio information, the preset color extraction model analyzes the frequency, rhythm, and intensity of the audio and converts these features into brightness parameters. For example, fast-paced music may be translated into a higher brightness, while slow-paced music may be translated into a lower brightness.
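As an illustration of such a conversion, the sketch below maps a rough loudness measure and a tempo estimate to a brightness value in [0, 1]; the normalization constants and the equal weighting of tempo and loudness are assumptions, and in practice these features would come from the trained model or an audio analysis library.

```python
import numpy as np

def audio_to_brightness(samples, sample_rate, estimated_tempo_bpm):
    """Map simple audio features to a brightness value in [0, 1].

    `samples` is a mono waveform as a float array; the tempo estimate would in practice
    come from the color extraction model or an audio library. The mapping below
    (tempo and loudness each contributing half) is an illustrative assumption.
    """
    rms = float(np.sqrt(np.mean(np.square(samples))))                     # rough loudness/intensity
    loudness_term = min(rms / 0.5, 1.0)                                   # normalize against an assumed full-scale RMS
    tempo_term = min(max((estimated_tempo_bpm - 60) / 120.0, 0.0), 1.0)   # 60 bpm -> 0, 180 bpm -> 1
    return 0.5 * tempo_term + 0.5 * loudness_term

# Fast, loud music yields a higher brightness than slow, quiet music
fast = audio_to_brightness(np.random.uniform(-0.4, 0.4, 44100), 44100, estimated_tempo_bpm=160)
slow = audio_to_brightness(np.random.uniform(-0.1, 0.1, 44100), 44100, estimated_tempo_bpm=70)
print(round(fast, 2), round(slow, 2))
```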
Based on the above steps S1211 to S1212, in response to the current scene mode being the entertainment mode or the rest mode, the picture image information and audio information currently displayed by the vehicle-mounted central control screen are captured at the preset time interval, the picture image information is analyzed by using the preset color extraction model to determine the hue parameters, and the audio information is analyzed by using the preset color extraction model to determine the brightness parameters. In this way, the hue parameters of the vehicle-mounted atmosphere lamp can be automatically adjusted according to the current scene mode and the content of the vehicle-mounted central control screen, providing a more personalized and comfortable in-vehicle environment for the driver and passengers.
Optionally, in step S12, analyzing the current scene mode with the preset color extraction model, and determining the initial tone parameter of the vehicle-mounted atmosphere lamp includes:
Step S1221, responding to the current scene mode as a rest mode, and controlling the vehicle-mounted central control screen to display a preset dynamic scene;
Specifically, if the current scene mode is the rest mode, the vehicle-mounted central control screen is controlled to display a preset dynamic scene, where the preset dynamic scene may be a video or an animation, so as to create a relaxed and comfortable atmosphere. Moreover, the preset dynamic scene may be played in a loop or may change over time so as to provide a visually soothing experience.
Step S1222, extracting a preset number of hue parameters in the dynamic scene by using a preset color extraction model, wherein the preset number is smaller than or equal to the number of the vehicle-mounted atmosphere lamps.
Specifically, the preset color extraction model is used to analyze the colors in the dynamic scene and to extract a preset number of hue parameters from the dynamic scene, where the preset number is less than or equal to the number of vehicle-mounted atmosphere lamps. The extracted hue parameters are used as the initial hue parameters of the vehicle-mounted atmosphere lamp. For example, if there are 6 vehicle-mounted atmosphere lamps, 3 to 6 main hue parameters may be extracted.
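One way such extraction could be realized is by clustering the pixels of a scene frame and converting the cluster centers to hue parameters, as sketched below; the use of k-means in RGB space, the frame size, and the default of three hues are illustrative assumptions, not the prescribed form of the preset color extraction model.

```python
import colorsys
import numpy as np
from sklearn.cluster import KMeans

def extract_dominant_hues(frame_rgb, num_lamps, preset_count=3):
    """Extract up to `preset_count` dominant hue parameters from one frame of the dynamic scene.

    `frame_rgb` is an (H, W, 3) uint8 array; clustering in RGB space is one possible
    realization of the color extraction step, used here purely for illustration.
    """
    count = min(preset_count, num_lamps)   # never more hues than there are atmosphere lamps
    pixels = frame_rgb.reshape(-1, 3).astype(np.float64) / 255.0
    centers = KMeans(n_clusters=count, n_init=10, random_state=0).fit(pixels).cluster_centers_
    hues = []
    for r, g, b in centers:
        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        hues.append({"hue": round(h * 360.0, 1), "saturation": round(s, 2), "brightness": round(v, 2)})
    return hues

# e.g. 6 atmosphere lamps, 3 dominant hues extracted from a synthetic frame
frame = np.random.randint(0, 256, size=(120, 160, 3), dtype=np.uint8)
print(extract_dominant_hues(frame, num_lamps=6))
```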
Based on the above steps S1221 to S1222, the vehicle-mounted central control screen is controlled to display a preset dynamic scene in response to the current scene mode being the rest mode, and the preset color extraction model is used to extract a preset number of hue parameters from the dynamic scene, so that the color of the vehicle-mounted atmosphere lamp changes synchronously with the dynamic scene and the sense of scene fusion is enhanced. This achieves the purpose of controlling real-time interaction between the vehicle-mounted atmosphere lamp and the central control screen, thereby realizing the technical effects of improving the flexibility of the vehicle-mounted atmosphere lamp and the driving experience.
Optionally, in step S12, analyzing the current scene mode with the preset color extraction model, and determining the initial tone parameter of the vehicle-mounted atmosphere lamp includes:
Step S1231, responding to the current scene mode as an interaction mode, and acquiring input hue parameters of a vehicle-mounted central control screen;
In step S1231, the input hue parameter is used to characterize the color directly selected by the user, or may be a color automatically generated based on the touch operation of the user on the screen.
Specifically, if the current scene mode is the interactive mode, the user may set the background color of the vehicle-mounted central control screen and select a color presented on the central control screen by using a stylus, a finger, or the like, and a color may also be automatically generated based on the user's touch operation on the screen. For example, the user may directly select a color provided by the central control screen, or may adjust a color provided by the central control screen to obtain a desired color.
Step S1232, screening the input hue parameters by using a preset color extraction model to determine the hue parameters of the vehicle-mounted atmosphere lamp, and determining the brightness parameters based on the touch frequency and the touch duration of the user on the vehicle-mounted central control screen.
Specifically, using the preset color extraction model, the input hue parameters are first converted into hue parameters that can be used for setting the atmosphere lamp. The input hue parameters are then screened to determine the hue parameters among the initial hue parameters of the vehicle-mounted atmosphere lamp, and the brightness parameters are determined based on the touch frequency and touch duration of the user on the vehicle-mounted central control screen.
Specifically, the preset color extraction model is used to analyze the hue parameters entered by the user, and the dominant or blended color is extracted from the input hue parameters. Screening criteria include, but are not limited to, the brightness and saturation of the color and the degree of match with the user's preferences. The preset color extraction model is further used to analyze the touch frequency and touch duration of the user on the vehicle-mounted central control screen, and the brightness parameter of the atmosphere lamp is determined according to the touch frequency and duration. If the touch frequency is higher than or equal to a preset frequency threshold, or the touch duration is higher than or equal to a preset duration threshold, this indicates that the user's mood is relatively active at that moment, so the brightness of the vehicle-mounted atmosphere lamp is set higher to match the user's active mood. If the touch frequency is lower than the preset frequency threshold and the touch duration is lower than the preset duration threshold, this indicates that the user's mood is mild, so the brightness of the vehicle-mounted atmosphere lamp is set lower to match the user's mild mood. The brightness parameter changes gradually and is dynamically adjusted according to the user's touch behavior.
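A minimal sketch of such a threshold-based, gradually changing brightness adjustment is shown below; the threshold values, the target brightness levels, and the step size are illustrative assumptions.

```python
def brightness_from_touch(touch_frequency_hz, touch_duration_s,
                          frequency_threshold=2.0, duration_threshold=3.0,
                          previous_brightness=0.5, step=0.1):
    """Derive a brightness value from touch behavior, changing gradually as described above.

    Frequent or long touches nudge brightness up (active mood); otherwise brightness is
    nudged down (mild mood). All numeric defaults are assumptions for illustration.
    """
    if touch_frequency_hz >= frequency_threshold or touch_duration_s >= duration_threshold:
        target = 0.9    # active mood: brighter light
    else:
        target = 0.3    # mild mood: dimmer light
    # Gradual adjustment toward the target rather than an abrupt jump
    if target > previous_brightness:
        return min(previous_brightness + step, target)
    return max(previous_brightness - step, target)

print(brightness_from_touch(3.0, 1.0))   # frequent touching -> brightness rises toward 0.9
print(brightness_from_touch(0.5, 1.0))   # sparse touching -> brightness falls toward 0.3
```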
Based on the above steps S1231 to S1232, the input hue parameters of the vehicle-mounted central control screen are obtained in response to the current scene mode being the interaction mode, the input hue parameters are screened by using the preset color extraction model to determine the hue parameters of the vehicle-mounted atmosphere lamp, and the brightness parameters are determined based on the touch frequency and touch duration of the user on the vehicle-mounted central control screen. In this way, each touch of the user receives real-time visual feedback, which enhances the immediacy and interactivity of the interaction; by analyzing the user's touch frequency and touch duration, the user's emotional state can be captured and the light brightness adjusted accordingly, achieving emotional synchronization. This achieves the purpose of controlling real-time interaction between the vehicle-mounted atmosphere lamp and the central control screen, thereby realizing the technical effects of improving the flexibility of the vehicle-mounted atmosphere lamp and the driving experience.
Optionally, in step S14, adjusting the initial tone parameter based on the preset color policy and the historical tone parameter, to obtain the target tone parameter includes:
step S141, identifying emotion theme information of a current display picture of the vehicle-mounted central control screen by using a preset emotion identification model;
Specifically, a preset emotion recognition model is used to analyze the picture currently displayed by the vehicle-mounted central control screen. The preset emotion recognition model can recognize the emotion theme in the picture based on techniques such as machine learning and deep learning. For example, using a CNN combined with an emotion analysis algorithm, emotion theme information of the currently displayed picture, such as happiness, sadness, calm, or agitation, is identified. After recognition, the preset emotion recognition model outputs one or more emotion identifiers.
Step S142, adjusting initial tone parameters based on a preset color strategy and emotion theme information to obtain first tone parameters;
Specifically, the tone parameters are adjusted according to the preset color strategy and the emotion theme information. According to the emotion theme information identified by the preset emotion recognition model, the initial tone parameters are adjusted in combination with the color psychology application strategy so as to match the emotion theme. For example, if the emotion theme is happy, the color psychology application strategy would suggest using brighter and warmer colors; if it is sad, the strategy would suggest using softer, cooler colors. The adjusted hue parameters become the first hue parameters, which are the result of the preliminary adjustment and subsequently need to be optimized in combination with the historical data.
Step S143, the first tone parameter is adjusted in combination with the historical tone parameter to obtain the target tone parameter.
Specifically, the user's historical hue parameters associated with the current emotion theme are retrieved, the historical hue parameters reflecting the user's preferences under similar emotion themes in the past. The first hue parameters are optimized in combination with the historical hue parameters to more accurately match the personalized preferences of the user, for example, by fine-tuning the brightness, saturation, or hue of the color. The finally adjusted hue parameters become the target hue parameters, which are used to set the vehicle-mounted atmosphere lamps so as to provide an in-vehicle environment matching the current emotion theme and the user's preferences. For example, the current scene mode is the entertainment mode, and the vehicle-mounted central control screen is displaying a movie with a sad theme. The picture image currently displayed on the vehicle-mounted central control screen is captured at a preset time interval, and the audio information is extracted. Based on the preset color extraction model, it is determined that the initial tone parameters of the vehicle-mounted atmosphere lamp include a hue of 240 degrees (blue series), a saturation of 0.8, and a brightness of 0.6. The emotion recognition model analyzes the movie picture and recognizes that the emotion theme is sad. According to the preset color strategy, a cooler and softer light blue is determined for the sad theme so as to adjust the hue parameters and obtain the first hue parameters. Since it is also retrieved that the user tends to prefer a slightly darker hue when watching a movie with a similarly sad theme, the brightness in the first hue parameters is slightly reduced to obtain the target hue parameters. Thus, the target hue parameters, which incorporate the user's historical preferences, produce a slightly dimmed blue light that enhances the sad atmosphere of the movie and matches the user's personalized preferences.
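The sketch below illustrates one way the two adjustment stages could be combined; the lookup tables and the concrete offsets are assumptions chosen to mirror the worked example above (a sad movie scene shifted toward a dimmer, cooler blue), not a prescribed implementation.

```python
def adjust_for_emotion_and_history(initial, emotion_theme, strategy_table, history_by_emotion):
    """Apply the preset color strategy for the recognized emotion theme, then refine with history.

    `strategy_table` and `history_by_emotion` are illustrative lookups; the offsets below
    are assumptions mirroring the worked example (sad theme -> cooler, softer, dimmer blue).
    """
    params = dict(initial)
    # Step S142: preliminary adjustment according to the color-psychology strategy
    strategy = strategy_table.get(emotion_theme, {})
    params["hue"] = strategy.get("hue", params["hue"])
    params["saturation"] = max(0.0, min(1.0, params["saturation"] + strategy.get("saturation_offset", 0.0)))
    # Step S143: refinement using the user's historical preference under the same emotion theme
    history = history_by_emotion.get(emotion_theme, {})
    params["brightness"] = max(0.0, min(1.0, params["brightness"] + history.get("brightness_offset", 0.0)))
    return {key: round(value, 2) for key, value in params.items()}

initial = {"hue": 240.0, "saturation": 0.8, "brightness": 0.6}
strategies = {"sad": {"hue": 210.0, "saturation_offset": -0.1}}   # cooler, softer light blue
history = {"sad": {"brightness_offset": -0.1}}                    # the user prefers a slightly darker hue
print(adjust_for_emotion_and_history(initial, "sad", strategies, history))
```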
Based on the above steps S141 to S143, the emotion theme information of the picture currently displayed by the vehicle-mounted central control screen is identified by using the preset emotion recognition model, the initial tone parameters are adjusted based on the preset color strategy and the emotion theme information to obtain the first tone parameters, and the first tone parameters are adjusted in combination with the historical tone parameters to obtain the target tone parameters. By recognizing the emotion theme of the picture with the emotion recognition model, an in-vehicle environment that resonates with the user's emotional state can be provided synchronously; by combining the user's historical tone parameters, the system can provide light settings that better match personal preferences, enhancing the personalized experience; by applying the principles of color psychology, the user's mood can be improved or soothed; and the atmosphere lamp can be dynamically adjusted according to real-time emotion analysis and user interaction data to adapt to different scenes and emotional requirements.
Optionally, in step S16, controlling the vehicle-mounted atmosphere lamp to output the light with the corresponding tone by using the target tone parameter includes:
step S1611, generating a light control signal based on a target tone parameter and a preset segmentation distance in response to the current scene mode being an entertainment mode or a rest mode;
In step S1611, the preset segmentation distance is used to characterize the gradual-change distance of the light variation of the vehicle-mounted atmosphere lamp.
Step S1612, a light control signal is sent to the vehicle-mounted atmosphere lamp, and the vehicle-mounted atmosphere lamp is controlled to continuously output light with a corresponding tone.
Specifically, if the current scene mode is the entertainment mode or the rest mode, a light control signal is generated based on the target tone parameter and a preset segmentation distance. The light control signal specifically indicates information such as the color, brightness, and saturation that each vehicle-mounted atmosphere lamp should display, as well as the color change mode, such as a gradual change, flickering, or a breathing effect. After receiving the control signal, the vehicle-mounted atmosphere lamp adjusts its color and brightness according to the indication of the light control signal so as to output light matching the target tone parameter. The vehicle-mounted atmosphere lamp then continuously outputs light of the set tone until a new light control signal is received or the scene mode changes. For example, suppose a target tone parameter has been determined. A light control signal is generated according to the target tone parameter and a preset segmentation distance (the vehicle-mounted atmosphere lamps being divided into a front section, a middle section, and a rear section). The light control signal indicates that the front-section vehicle-mounted atmosphere lamps display brighter blue, the middle section displays medium-brightness blue, and the rear section displays darker blue. The control signal is sent to the vehicle-mounted atmosphere lamps. After receiving the signal, the front-section atmosphere lamps display bright blue light, the middle section displays blue light of medium brightness, and the rear section displays darker blue light, simulating the stage lighting effect of a concert. The vehicle-mounted atmosphere lamps continuously output light of these hues until the video ends or the user changes the scene mode.
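A sketch of how such a segmented light control signal might be assembled is given below; the segment names (front, middle, rear), the per-segment brightness scaling, and the "gradual" change mode are assumptions matching the example above, and in a vehicle the resulting commands would be sent to the lamps over the in-vehicle network.

```python
def build_light_control_signal(target, segments=("front", "middle", "rear"),
                               brightness_levels=(1.0, 0.7, 0.4)):
    """Build a per-segment light control signal from the target hue parameters.

    Each entry describes the color, brightness, and change mode one lamp segment should display;
    the scaling factors give a bright / medium / darker gradient across the segments.
    """
    return [
        {
            "segment": name,
            "hue": target["hue"],
            "saturation": target["saturation"],
            "brightness": round(target["brightness"] * level, 2),
            "mode": "gradual",      # could also be "flicker" or "breathing"
        }
        for name, level in zip(segments, brightness_levels)
    ]

target = {"hue": 240.0, "saturation": 0.8, "brightness": 0.8}   # the blue used in the example above
for command in build_light_control_signal(target):
    print(command)   # in a vehicle, each command would be sent to the corresponding lamp segment
```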
Based on the above steps S1611 to S1612, when the current scene mode is the entertainment mode or the rest mode, a light control signal is generated based on the target tone parameter and the preset segmentation distance, the light control signal is sent to the vehicle-mounted atmosphere lamp, and the vehicle-mounted atmosphere lamp is controlled to continuously output light of the corresponding tone. Adjusting the light according to the target tone parameter and the scene mode can enhance the atmosphere experience in the vehicle, and the gradual-change effect and appropriate brightness adjustment of the light help reduce visual fatigue and provide a more comfortable visual environment.
Optionally, in step S16, controlling the vehicle-mounted atmosphere lamp to output the light with the corresponding tone by using the target tone parameter includes:
Step S1621, responding to the current scene mode as an interactive mode, and generating a light control signal based on the target tone parameter and the preset segmentation distance;
Step S1622, transmitting a light control signal to the vehicle-mounted atmosphere lamp to control the vehicle-mounted atmosphere lamp to circularly output light with a corresponding tone.
Specifically, if the current scene mode is the interactive mode, a light control signal is generated based on the target tone parameter and a preset segmentation distance, where the preset segmentation distance may be set based on the layout of the vehicle-mounted atmosphere lamps. The light control signal directs each atmosphere lamp segment to display a specific color and brightness, and possibly a dynamic effect such as a color gradual change, flickering, or a breathing effect. The generated light control signal is sent to each vehicle-mounted atmosphere lamp through the in-vehicle network. After receiving the light control signal, the vehicle-mounted atmosphere lamp adjusts its color and brightness according to the light control signal to match the target tone parameter, and continuously and cyclically outputs light of the set tone so as to support the dynamic environment in the interaction mode. The cyclic output may be a continuous change in color or a periodic repetition of a light effect.
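The cyclic output could, for example, be realized by repeatedly replaying a sequence of light control signals, as sketched below; `send_to_lamps`, the period, and the fixed number of cycles are hypothetical placeholders for the in-vehicle network interface and its timing.

```python
import itertools
import time

def cycle_light_effect(send_to_lamps, signals, period_s=2.0, cycles=3):
    """Repeatedly replay a sequence of light control signals to create a cyclic light effect.

    `send_to_lamps` stands in for the in-vehicle network interface; in a real system the loop
    would run until the scene mode changes rather than for a fixed number of cycles.
    """
    for signal in itertools.islice(itertools.cycle(signals), cycles * len(signals)):
        send_to_lamps(signal)          # each signal describes per-segment color, brightness, and mode
        time.sleep(period_s)           # wait before applying the next step of the effect

# Example usage with a trivial stand-in for the network interface
cycle_light_effect(print, [{"hue": 210.0, "brightness": 0.6}, {"hue": 210.0, "brightness": 0.3}],
                   period_s=0.01, cycles=1)
```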
Based on the above steps S1621 to S1622, when the current scene mode is the interactive mode, a light control signal is generated based on the target tone parameter and the preset segmentation distance, the light control signal is sent to the vehicle-mounted atmosphere lamp, and the vehicle-mounted atmosphere lamp is controlled to cyclically output light of the corresponding tone. Adjusting the light according to the target tone parameter and the scene mode can enhance the atmosphere experience in the vehicle, and the gradual-change effect and appropriate brightness adjustment of the light help reduce visual fatigue and provide a more comfortable visual environment.
Optionally, in response to the current scene mode being an interactive mode, the vehicle is in a stationary state.
Specifically, if the interactive mode is selected as the current scene mode, the vehicle must be in a stationary state to prevent the interactive mode from affecting the driver, so that the vehicle remains in a safe state.
Based on the above, in response to the current scene mode being the interaction mode, the vehicle is kept in a stationary state, so that the interaction mode does not affect the driver and the vehicle remains in a safe state.
Fig. 2 is a flowchart of a control method of a vehicle-mounted atmosphere lamp according to still another embodiment of the present invention, as shown in fig. 2, the method includes the steps of:
Step S201, a current scene mode of a vehicle-mounted central control screen is obtained, wherein the current scene mode comprises one of an entertainment mode, a rest mode and an interaction mode;
step S202, in response to the current scene mode being an entertainment mode, capturing image information and audio information of a picture currently displayed by a vehicle-mounted central control screen according to a preset time interval;
step S203, analyzing the picture image information by using a preset color extraction model to determine hue parameters, and analyzing the audio information by using a preset color extraction model to determine brightness parameters;
Step S204, responding to the current scene mode as a rest mode, and controlling the vehicle-mounted central control screen to display a preset dynamic scene;
step S205, extracting a preset number of hue parameters in the dynamic scene by using a preset color extraction model;
step S206, responding to the current scene mode as an interaction mode, and acquiring input hue parameters of the vehicle-mounted central control screen;
Step S207, screening the input hue parameters by using a preset color extraction model to determine the hue parameters of the vehicle-mounted atmosphere lamp;
Step S208, determining brightness parameters based on the touch frequency and the touch duration of a user on the vehicle-mounted central control screen;
step S209, identifying emotion theme information of a current display picture of the vehicle-mounted central control screen by using a preset emotion identification model;
Step S210, adjusting initial tone parameters based on a preset color strategy and emotion theme information to obtain first tone parameters;
step S211, adjusting the first tone parameter by combining the historical tone parameter to obtain a target tone parameter;
step S212, generating a light control signal based on the target tone parameter and the preset segmentation distance in response to the current scene mode being an entertainment mode or a rest mode;
Step S213, a light control signal is sent to the vehicle-mounted atmosphere lamp, and the vehicle-mounted atmosphere lamp is controlled to continuously output light with a corresponding tone;
Step S214, generating a light control signal based on the target tone parameter and the preset segmentation distance in response to the current scene mode being an interactive mode;
Step S215, a light control signal is sent to the vehicle-mounted atmosphere lamp, and the vehicle-mounted atmosphere lamp is controlled to circularly output light with a corresponding tone.
Based on the above steps S201 to S215, in the embodiment of the present invention, the current scene mode of the vehicle-mounted central control screen is obtained, the current scene mode is analyzed by using a preset color extraction model, initial tone parameters of the vehicle-mounted atmosphere lamp are determined, the initial tone parameters are adjusted based on a preset color strategy and historical tone parameters to obtain target tone parameters, and finally the vehicle-mounted atmosphere lamp is controlled to output light of a corresponding tone by using the target tone parameters. This achieves the purpose of controlling real-time interaction between the vehicle-mounted atmosphere lamp and the central control screen, thereby realizing the technical effects of improving the flexibility of the vehicle-mounted atmosphere lamp and the driving experience, and further solving the technical problem in the related art that the driving experience is poor because the vehicle-mounted atmosphere lamp cannot interact with the central control screen in real time.
From the description of the above embodiments, it will be clear to a person skilled in the art that the method according to the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, or of course by means of hardware, although in many cases the former is the preferred implementation. Based on such an understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk), the software product including instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the method according to the embodiments of the present invention.
The embodiment of the invention also provides a control device of the vehicle-mounted atmosphere lamp, which is used to implement the above embodiments and preferred implementations; what has already been described is not repeated. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the apparatus described in the following embodiments is preferably implemented in software, an implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
Fig. 3 is a block diagram of a control apparatus of an in-vehicle atmosphere lamp according to an embodiment of the present invention. As shown in fig. 3, the apparatus includes:
The acquiring module 301 is configured to acquire a current scene mode of the in-vehicle central control screen, where the current scene mode includes one of an entertainment mode, a rest mode, and an interaction mode;
The determining module 302 is configured to analyze the current scene mode by using a preset color extraction model and determine an initial tone parameter of the vehicle-mounted atmosphere lamp;
the adjusting module 303 is configured to adjust the initial tone parameter based on a preset color policy and a historical tone parameter to obtain a target tone parameter;
And the control module 304 is used for controlling the vehicle-mounted atmosphere lamp to output the lamp light with the corresponding tone by utilizing the target tone parameter.
Optionally, the obtaining module 301 is further configured to intercept, at preset time intervals, image information and audio information of a picture currently displayed by the on-vehicle central control screen in response to the current scene mode being an entertainment mode or a rest mode, and the determining module 302 is further configured to analyze the image information to determine hue parameters by using a preset color extraction model, and analyze the audio information to determine brightness parameters by using the preset color extraction model.
Optionally, the control module 304 is further configured to control the on-vehicle central control screen to display a preset dynamic scene in response to the current scene mode being a rest mode, and the determining module 302 is further configured to extract a preset number of hue parameters in the dynamic scene by using a preset color extraction model, where the preset number is less than or equal to the number of on-vehicle atmosphere lamps.
Optionally, the obtaining module 301 is further configured to obtain an input hue parameter of the vehicle-mounted central control screen in response to the current scene mode being an interaction mode, and the determining module 302 is further configured to screen the input hue parameter by using a preset color extraction model, determine a hue parameter of the vehicle-mounted atmosphere lamp, and determine a brightness parameter based on a touch frequency and a touch duration of a user on the vehicle-mounted central control screen.
Optionally, the control device of the vehicle-mounted atmosphere lamp further comprises a recognition module 305 for recognizing emotion theme information of a current display picture of the vehicle-mounted central control screen by using a preset emotion recognition model, wherein the adjustment module 303 is further used for adjusting initial tone parameters based on a preset color strategy and the emotion theme information to obtain first tone parameters, and adjusting the first tone parameters in combination with the historical tone parameters to obtain target tone parameters.
Optionally, the control module 304 is further configured to generate a light control signal based on the target tone parameter and the preset segmentation distance in response to the current scene mode being an entertainment mode or a rest mode, and send the light control signal to the vehicle-mounted atmosphere lamp to control the vehicle-mounted atmosphere lamp to continuously output light with a corresponding tone.
Optionally, the control module 304 is further configured to generate a light control signal based on the target tone parameter and the preset segment distance in response to the current scene mode being an interactive mode, and send the light control signal to the vehicle-mounted atmosphere lamp to control the vehicle-mounted atmosphere lamp to circularly output light with a corresponding tone.
Optionally, in response to the current scene mode being an interactive mode, the vehicle is in a stationary state.
It should be noted that each of the above modules may be implemented by software or hardware. For the latter, the above modules may be implemented in, but not limited to, the following manner: the above modules are all located in the same processor, or the above modules are located in different processors in any combination.
According to an embodiment of the present invention, there is also provided an electronic device including a memory and a processor, wherein the memory stores an executable program, and the processor is configured to run the program, the control method of the vehicle-mounted atmosphere lamp described above being executed when the program runs.
Alternatively, in the present embodiment, the above-described processor may be configured to execute the following steps by a computer program:
Step S1, acquiring a current scene mode of a vehicle-mounted central control screen, wherein the current scene mode comprises one of an entertainment mode, a rest mode and an interaction mode;
Step S2, analyzing a current scene mode by using a preset color extraction model, and determining initial tone parameters of the vehicle-mounted atmosphere lamp;
step S3, adjusting the initial tone parameters based on a preset color strategy and the historical tone parameters to obtain target tone parameters;
Step S4, controlling the vehicle-mounted atmosphere lamp to output lamplight with a corresponding tone by using the target tone parameter.
According to an embodiment of the present invention, there is further provided a computer-readable storage medium, the computer-readable storage medium including a stored executable program, wherein, when the executable program runs, a device in which the storage medium is located is controlled to execute the above-mentioned control method of the vehicle-mounted atmosphere lamp.
Alternatively, in the present embodiment, the above-described storage medium may be configured to store a computer program for performing the steps of:
Step S1, acquiring a current scene mode of a vehicle-mounted central control screen, wherein the current scene mode comprises one of an entertainment mode, a rest mode and an interaction mode;
Step S2, analyzing a current scene mode by using a preset color extraction model, and determining initial tone parameters of the vehicle-mounted atmosphere lamp;
step S3, adjusting the initial tone parameters based on a preset color strategy and the historical tone parameters to obtain target tone parameters;
Step S4, controlling the vehicle-mounted atmosphere lamp to output lamplight with a corresponding tone by using the target tone parameter.
Alternatively, in the present embodiment, the above storage medium may include, but is not limited to, various media in which a computer program may be stored, such as a USB flash disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a removable hard disk, a magnetic disk, or an optical disk.
According to an embodiment of the present invention, there is also provided a computer program product comprising a computer program which, when executed by a processor, implements the above-mentioned method for controlling an on-board atmosphere lamp.
Alternatively, in the present embodiment, the above-described computer program product may be provided as a computer program that performs the steps of:
Step S1, acquiring a current scene mode of a vehicle-mounted central control screen, wherein the current scene mode comprises one of an entertainment mode, a rest mode and an interaction mode;
Step S2, analyzing a current scene mode by using a preset color extraction model, and determining initial tone parameters of the vehicle-mounted atmosphere lamp;
step S3, adjusting the initial tone parameters based on a preset color strategy and the historical tone parameters to obtain target tone parameters;
Step S4, controlling the vehicle-mounted atmosphere lamp to output lamplight with a corresponding tone by using the target tone parameter.
Alternatively, for specific examples in this embodiment, reference may be made to the examples described in the foregoing embodiments and optional implementations, and details are not repeated in this embodiment.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
In the foregoing embodiments of the present invention, the description of each embodiment has its own emphasis; for a portion that is not described in detail in a certain embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technology may be implemented in other manners. The above-described embodiments of the apparatus are merely exemplary, and the division of the units, for example, may be a logic function division, and may be implemented in another manner, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some interfaces, units or modules, or may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
If the integrated units are implemented in the form of software functional units and sold or used as stand-alone products, they may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, the software product including instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods according to the embodiments of the present invention. The storage medium includes various media that can store program code, such as a USB flash disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is merely a preferred embodiment of the present invention, and it should be noted that various modifications and improvements may be made by those skilled in the art without departing from the principles of the present invention, and such modifications and improvements shall also be regarded as falling within the scope of the present invention.