Detailed Description
To make the purpose and embodiments of the present application clearer, the exemplary embodiments of the present application will be described clearly and completely below with reference to the attached drawings. It is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, and not all of them.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
Fig. 1 is a schematic diagram of a usage scenario of a display device according to an embodiment. As shown in Fig. 1, the display apparatus 200 is in data communication with a server 400, and a user can operate the display apparatus 200 through the smart device 300 or the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller. Communication between the remote controller and the display device includes at least one of infrared protocol communication, Bluetooth protocol communication, or other short-distance communication methods, and the remote controller controls the display device 200 in a wireless or wired manner. The user may control the display apparatus 200 by inputting a user instruction through at least one of a key on the remote controller, a voice input, a control panel input, and the like.
In some embodiments, the smart device 300 may include any of a mobile terminal, a tablet, a computer, a laptop, an AR/VR device, and the like.
In some embodiments, the smart device 300 may also be used to control the display device 200. For example, the display device 200 is controlled using an application running on the smart device.
In some embodiments, the smart device 300 and the display device may also be used for data communication.
In some embodiments, the display device 200 may also be controlled in a manner other than through the control apparatus 100 and the smart device 300. For example, a voice instruction from the user may be received directly by a module configured inside the display device 200, or by a voice control apparatus provided outside the display device 200.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display apparatus 200. The server 400 may be one cluster or a plurality of clusters, and may include one or more types of servers.
In some embodiments, software steps executed by one step execution agent may be migrated on demand to another step execution agent in data communication therewith for execution. Illustratively, software steps performed by the server may be migrated to be performed on a display device in data communication therewith, and vice versa, as desired.
Fig. 2 exemplarily shows a block diagram of a configuration of the control apparatus 100 according to an exemplary embodiment. As shown in Fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive an input operation instruction from a user and convert the operation instruction into an instruction that the display device 200 can recognize and respond to, serving as an interaction intermediary between the user and the display device 200.
In some embodiments, the communication interface 130 is used for external communication, and includes at least one of a WiFi chip, a Bluetooth module, an NFC module, or an alternative module.
In some embodiments, the user input/output interface 140 includes at least one of a microphone, a touchpad, a sensor, a key, or an alternative module.
Fig. 3 shows a hardware configuration block diagram of the display apparatus 200 according to an exemplary embodiment.
In some embodiments, the display apparatus 200 includes at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
In some embodiments, the controller comprises a central processor, a video processor, an audio processor, a graphics processor, a RAM, a ROM, and first to nth interfaces for input/output.
In some embodiments, the display 260 includes a display screen component for displaying pictures and a driving component for driving image display, and receives image signals output from the controller to display video content, image content, a menu manipulation interface, a user manipulation UI, and the like.
In some embodiments, the display 260 may be at least one of a liquid crystal display, an OLED display, and a projection display, and may also be a projection device and a projection screen.
In some embodiments, the tuner demodulator 210 receives broadcast television signals via wired or wireless reception, and demodulates audio/video signals, as well as EPG data signals, from a plurality of wireless or wired broadcast television signals.
In some embodiments, the communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example, the communicator may include at least one of a WiFi module, a Bluetooth module, a wired Ethernet module, other network communication protocol chips or near field communication protocol chips, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the control device 100 or the server 400 through the communicator 220.
In some embodiments, the detector 230 is used to collect signals of the external environment or signals of interaction with the outside. For example, the detector 230 includes a light receiver, a sensor for collecting ambient light intensity; alternatively, the detector 230 includes an image collector, such as a camera, which may be used to collect external environment scenes, attributes of the user, or user interaction gestures; or the detector 230 includes a sound collector, such as a microphone, which is used to receive external sounds.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: a High Definition Multimedia Interface (HDMI), an analog or data high definition component input interface (Component), a composite video input interface (CVBS), a USB input interface (USB), an RGB port, and the like. The interface may also be a composite input/output interface formed by a plurality of the above interfaces.
In some embodiments, the controller 250 and the tuner demodulator 210 may be located in different separate devices; that is, the tuner demodulator 210 may also be located in a device external to the main device where the controller 250 is located, such as an external set-top box.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 controls the overall operation of the display apparatus 200. For example, in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink, an icon, or other actionable control. The operation related to the selected object is, for example, an operation of displaying a page, document, or image linked to a hyperlink, or an operation of launching a program corresponding to an icon.
In some embodiments, the controller comprises at least one of a Central Processing Unit (CPU), a video processor, an audio processor, a Graphics Processing Unit (GPU), a Random Access Memory (RAM), a Read-Only Memory (ROM), first to nth interfaces for input/output, a communication bus (Bus), and the like.
The CPU processor is used for executing operating system and application instructions stored in the memory, and for executing various applications, data, and contents according to various interactive instructions received from external input, so as to finally display and play various audio-video contents. The CPU processor may include a plurality of processors, e.g., a main processor and one or more sub-processors.
In some embodiments, the graphics processor is used for generating various graphics objects, such as at least one of an icon, an operation menu, and a graphic displayed in response to a user input instruction. The graphics processor comprises an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays various objects according to display attributes, and a renderer, which renders the various objects obtained by the arithmetic unit for display on the display.
In some embodiments, the video processor is configured to receive an external video signal and perform, according to a standard codec protocol of the input signal, at least one of video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis, so as to obtain a signal that can be displayed or played directly on the display device 200.
In some embodiments, the video processor includes at least one of a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like. The demultiplexing module is used for demultiplexing the input audio/video data stream. The video decoding module is used for processing the demultiplexed video signal, including decoding, scaling, and the like. The image synthesis module is used for superimposing and mixing the GUI signal generated by a graphics generator according to user input with the scaled video image, so as to generate an image signal for display. The frame rate conversion module is used for converting the frame rate of the input video. The display formatting module is used for converting the frame-rate-converted video output signal into a signal conforming to the display format, such as an output RGB data signal.
In some embodiments, the audio processor is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform at least one of noise reduction, digital-to-analog conversion, and amplification processing to obtain a sound signal that can be played in the speaker.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on the display 260, and the user input interface receives the user input commands through the GUI. Alternatively, the user may input a user command by making a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through a sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that is acceptable to the user. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include at least one of an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc. visual interface elements.
In some embodiments, the user interface 280 is an interface that may be used to receive control inputs (e.g., physical buttons on the body of the display device, or the like).
In some embodiments, the system of the display device may include a kernel (Kernel), a command parser (Shell), a file system, and applications. The kernel, Shell, and file system together make up the basic operating system structure that allows users to manage files, run programs, and use the system. After power-on, the kernel is started, the kernel space is activated, the hardware is abstracted, hardware parameters are initialized, and virtual memory, the scheduler, signals, and inter-process communication (IPC) are operated and maintained. After the kernel is started, the Shell and user applications are loaded. An application is compiled into machine code after being started, forming a process.
Referring to Fig. 4, in some embodiments, the system is divided into four layers, which are, from top to bottom, an Application (Applications) layer (referred to as the "application layer"), an Application Framework layer (referred to as the "framework layer"), an Android runtime and system library layer (referred to as the "system runtime library layer"), and a kernel layer.
In some embodiments, at least one application runs in the application layer, and the applications may be a Window (Window) program carried by an operating system, a system setting program, a clock program or the like; or an application developed by a third party developer. In particular implementations, the application packages in the application layer are not limited to the above examples.
The framework layer provides an Application Programming Interface (API) and a programming framework for applications of the application layer. The application framework layer includes some predefined functions. The application framework layer acts as a processing center that decides the actions of the applications in the application layer. Through the API interface, an application can access the resources in the system and obtain the services of the system during execution.
As shown in Fig. 4, in the embodiment of the present application, the application framework layer includes a manager (Managers), a Content Provider (Content Provider), and the like, where the manager includes at least one of the following modules: an Activity Manager (Activity Manager) is used for interacting with all activities running in the system; the Location Manager (Location Manager) is used for providing the system service or application with access to the system location service; a Package Manager (Package Manager) is used for retrieving various information related to the application packages currently installed on the device; a Notification Manager (Notification Manager) is used for controlling the display and clearing of notification messages; a Window Manager (Window Manager) is used for managing the icons, windows, toolbars, wallpapers, and desktop components on the user interface.
In some embodiments, the activity manager is used to manage the lifecycle of the various applications and the general navigation back function, such as controlling the exit, opening, and back operations of the applications. The window manager is used for managing all window programs, such as obtaining the size of the display screen, judging whether there is a status bar, locking the screen, capturing the screen, and controlling changes of the display window (for example, shrinking the display window, shaking the display, distorting the display, and the like).
In some embodiments, the system runtime library layer provides support for the upper layer, i.e., the framework layer. When the framework layer is used, the Android operating system runs the C/C++ libraries included in the system runtime library layer to implement the functions to be implemented by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in Fig. 4, the kernel layer includes at least one of the following drivers: an audio driver, a display driver, a Bluetooth driver, a camera driver, a WiFi driver, a USB driver, an HDMI driver, a sensor driver (for a fingerprint sensor, a temperature sensor, a pressure sensor, etc.), a power driver, and the like.
In some embodiments, the display device may directly enter the interface of a preset video-on-demand program after being started. The interface of the video-on-demand program may include at least a navigation bar 510 and a content display area located below the navigation bar 510, as shown in Fig. 5, where the content displayed in the content display area may change according to the control selected in the navigation bar. A program in the application layer can be integrated in the video-on-demand program and displayed through one control of the navigation bar, or can be further displayed after the application control in the navigation bar is selected.
In some embodiments, the display device may, after being started, directly enter the display interface of the signal source selected last time, or a signal source selection interface, where the signal source may be a preset video-on-demand program, or may be at least one of an HDMI interface, a live TV interface, and the like. After the user selects a signal source, the display may display contents obtained from that signal source.
The display device shown in this embodiment may have a resource sharing function, where the end sharing the resource is referred to as the Source end (also referred to as the screen projection end in this embodiment), and the end displaying the resource is referred to as the Sink end (also referred to as the display end in this embodiment). In the resource sharing process, the Source end needs to capture a video frame of the resource, and then the captured video is compressed, encoded, and transmitted to the Sink end. The Sink end decompresses the received compressed file and finally displays the decompressed picture.
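As a rough, non-authoritative sketch of this Source/Sink flow (all function names and the codec callables below are assumptions for illustration, not part of this application; a real implementation would use a concrete screen mirroring stack):

```python
from typing import Callable, Iterable
import numpy as np

def source_loop(capture: Callable[[], np.ndarray],
                encode: Callable[[np.ndarray], bytes],
                send: Callable[[bytes], None],
                should_exit: Callable[[], bool]) -> None:
    """Source end: capture a video frame of the resource, compress and
    encode it, then transmit it to the Sink end."""
    while not should_exit():
        frame = capture()
        send(encode(frame))

def sink_loop(packets: Iterable[bytes],
              decode: Callable[[bytes], np.ndarray],
              display: Callable[[np.ndarray], None]) -> None:
    """Sink end: decompress each received packet and display the picture."""
    for packet in packets:
        display(decode(packet))
```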
However, a phenomenon is often encountered in the screen projection process: the picture from the Source end cannot be displayed normally at the Sink end, with black borders appearing around the picture displayed at the Sink end, resulting in a poor user experience.
In order to solve the above technical problem, an embodiment of the present application provides a display device, where the display device is suitable for the display end, and the interaction process between the display end and the screen projection end may refer to Fig. 6.
Wherein the screen projection end is configured to perform step S101 to transmit a screen image;
in the scheme shown in this embodiment, the screen image is an image obtained by capturing the display interface of the screen projection end. When the resolution of the media asset image displayed by the screen projection end is inconsistent with the resolution of the display of the screen projection end, black borders appear around the screen image of the screen projection end. In the technical scheme shown in the embodiment of the present application, the screen image is divided into a black block image and a media asset image, where the media asset image is the picture of the media asset being played by the display device, and the black block image is the image around the media asset image.
In the technical solution shown in this embodiment, the display end may include a display (which may be referred to as a display end display in this embodiment for convenience of distinction) and a controller (which may be referred to as a display end controller in this embodiment for convenience of distinction);
in response to receiving the screen image transmitted by the screen projection end, the display end controller is configured to execute step S102 to delete the black block image of the screen image, so as to obtain a media asset image;
in the practical application process, there are various implementations for identifying the black block image. For example, in some possible implementations, the display end controller may pre-store a black block image pixel value. In response to receiving the screen image transmitted by the screen projection end, the display end controller traverses the pixel value of each pixel point of the screen image, and determines the region whose pixel values are equal to the black block image pixel value as the black block image. This is merely an exemplary description of one implementation of identifying the black block image; in practical applications, the implementation is not limited to the above manner.
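As an illustrative sketch only (the function name, the NumPy array representation, and the tolerance value below are assumptions, not taken from this application), the pixel-value comparison described above might look like the following:

```python
import numpy as np

# Assumed pre-stored black block pixel value; a small tolerance is added
# because encoded video rarely contains mathematically pure black.
BLACK_PIXEL_VALUE = 0
BLACK_TOLERANCE = 8  # hypothetical value

def black_block_mask(screen_image: np.ndarray) -> np.ndarray:
    """Return a boolean H x W mask that is True where a pixel of the
    received screen image matches the pre-stored black block pixel value.

    screen_image: H x W x 3 uint8 array received from the screen projection end.
    """
    # A pixel counts as black block when every colour channel is within
    # the tolerance of the stored black value.
    diff = np.abs(screen_image.astype(np.int16) - BLACK_PIXEL_VALUE)
    return np.all(diff <= BLACK_TOLERANCE, axis=-1)
```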
The display end controller is configured to execute step S103 to scale the media asset image according to the resolution of the display end display, so as to obtain a screen projection image;
the scaling factor of the media asset image in the technical scheme shown in the embodiment of the present application is determined by the ratio of the resolution of the display end display to the resolution of the media asset image. Specifically, in a feasible embodiment, the display end controller may calculate the ratio of the resolution of the display end display in the width direction to the resolution of the media asset image in the width direction, to obtain a first ratio; meanwhile, the display end controller may calculate the ratio of the resolution of the display end display in the height direction to the resolution of the media asset image in the height direction, to obtain a second ratio; the display end controller then selects the smaller of the first ratio and the second ratio as the scaling factor.
For example, in a possible embodiment, the resolution of the media asset image is 1080 × 360 and the resolution of the display end display is 1920 × 1080. In this embodiment, the display end controller calculates the first ratio 1920/1080 ≈ 1.7 and the second ratio 1080/360 = 3. The smaller value, about 1.7, is selected as the scaling factor; the display end controller controls the media asset image to be enlarged by a factor of about 1.7 (more precisely, 1920/1080), so as to obtain a 1920 × 640 media asset image, and the display end controller controls the display to present the 1920 × 640 media asset image.
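The min-of-ratios rule above is small enough to state directly in code. The sketch below is illustrative only; the function name and signature are assumptions:

```python
def scale_factor(display_w: int, display_h: int,
                 asset_w: int, asset_h: int) -> float:
    """Return the scaling factor for the media asset image: the smaller of
    the width-direction ratio and the height-direction ratio, so that the
    scaled image never exceeds the display in either direction."""
    first_ratio = display_w / asset_w    # width-direction ratio
    second_ratio = display_h / asset_h   # height-direction ratio
    return min(first_ratio, second_ratio)

# Worked example from this embodiment: a 1080 x 360 media asset image on a
# 1920 x 1080 display gives min(1920/1080, 1080/360) = min(1.78, 3.0) = 1.78,
# i.e. the factor of about 1.7 quoted above; 1080 x 1.78 = 1920 and
# 360 x 1.78 = 640, which yields the 1920 x 640 screen projection image.
```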
The display end controller is further configured to execute step S104 to control the display to display the screen projection image.
The display device shown in this embodiment is further described below with reference to specific examples:
referring to fig. 7, fig. 7 is a schematic diagram illustrating a change of a screen where a media asset image is located in a screen projection process according to a feasible embodiment. In this embodiment, the screen projection end is a display device displayed in a vertical screen, and the resolution of the screen projection end is 1080 × 1920, which can be specifically referred to as a schematic diagram 11 in fig. 7; in response to a screen capture instruction input by a user, capturing a screen image at a resolution of 1080 × 1920 by the screen projection end, which can be specifically referred to as a schematic diagram 12 in fig. 7; the screen projection end sends the resolution of the screen image to be 1080 × 1920 to the display end, so that the display end can set a transmission protocol based on the resolution of the screen shot image (the transmission protocol at least comprises a compression resolution). In the embodiment, the compression resolution is 430 × 1080, so the resolution of the screenshot image transmitted to the display end is 430 × 1080, which can be specifically referred to as the schematic diagram 13 in fig. 7; the display end controller deletes the black block image of the screen image to obtain a media asset image, wherein the resolution of the media asset image is 430 × 144, which can be specifically referred to as a schematic diagram 14 in fig. 7; the display end controller calculates the zoom factor of the media asset image to be 4.4 according to the ratio of the resolution of the display end display to the resolution of the media asset image, and controls the media asset image to be amplified by 4.4 times to obtain a screen projection image; finally, the display at the display end is controlled to display the projection image, which can be specifically referred to as a schematic diagram 15 in fig. 7.
The display device shown in this embodiment of the present application is suitable for the display end and includes a display and a controller. When the controller receives a screen image, it can identify the screen image and delete the black block image of the screen image to obtain a media asset image, and finally scale the media asset image according to the resolution of the display to obtain a screen projection image. The display device shown in this embodiment deletes the black block image of the screen image before screen projection and then scales the result to obtain the screen projection image; because the black block image around the picture has been removed, the screen projection image displayed at the display end has no black borders on at least two sides, and the user experience is better.
A second aspect of the embodiments of the present application provides a display device, where the display device is suitable for the screen projection end, and the interaction process between the display end and the screen projection end may refer to Fig. 8;
in response to a screen projection start instruction input by the user, the screen projection end is configured to execute step S201 to capture a screen image;
the implementation manner of capturing the screen image may adopt an image capturing manner which is customary in the art, and the applicant does not make much limitation here.
The screen projection end is configured to execute step S202 to delete the black block image of the screen image to obtain a media asset image;
the implementation of deleting the black block image in the screen image may refer to the above embodiments, and the applicant does not repeat here.
The screen projection end is configured to execute step S203 to output the media asset image to the display end, so that the display end scales the media asset image according to the resolution of the display end.
In this embodiment, the media asset image may be transmitted in a data transmission mode commonly used in the art, for example, Bluetooth transmission, network transmission, and the like.
In this embodiment, the display end includes a display (referred to as the display end display in this embodiment for convenience of distinction) and a controller (referred to as the display end controller in this embodiment for convenience of distinction).
In response to receiving the media asset image transmitted by the screen projection end, the display end controller is configured to execute step S204 to scale the media asset image according to the resolution of the display end display, so as to obtain a screen projection image;
the implementation manner of the display end controller scaling the media asset image according to the resolution of the display end display may refer to the above embodiments, and details are not described in this application.
The display end controller is configured to execute step S205 to control the display to display the screen projection image.
The display device shown in this embodiment is further described below with reference to specific examples:
referring to Fig. 9, Fig. 9 is a schematic diagram illustrating the changes of the picture containing a media asset image in the screen projection process according to a feasible embodiment. In this embodiment, the screen projection end is a display device displayed in portrait orientation, and the resolution of the screen projection end display is 1080 × 1920; see diagram 21 in Fig. 9. In response to a screen capture instruction input by the user, the screen projection end controller captures a screen image, where the resolution of the screen image is 1080 × 1920; see diagram 22 in Fig. 9. The screen projection end controller deletes the black block image of the screen image to obtain a media asset image, where the resolution of the media asset image is 1080 × 360; see diagram 23 in Fig. 9. The screen projection end controller sends the resolution of the media asset image, 1080 × 360, to the display end, so that the display end can set a transmission protocol based on the resolution of the media asset image (the transmission protocol includes at least a compression resolution). In this embodiment, the compression resolution is 1080 × 360, so the resolution of the media asset image transmitted to the display end is 1080 × 360; see diagram 24 in Fig. 9. The display end controller calculates the scaling factor of the media asset image to be about 1.7 according to the ratio of the resolution of the display end display to the resolution of the media asset image, and controls the media asset image to be enlarged by a factor of about 1.7 to obtain a screen projection image. Finally, the display end display is controlled to display the screen projection image; see diagram 25 in Fig. 9.
The display device shown in this embodiment of the present application is suitable for the screen projection end and includes a display and a controller. In response to a screen projection start instruction input by the user, the controller first captures a screen image, and then deletes the black block image of the screen image to obtain a media asset image. The display device shown in this embodiment deletes the black block image of the screen image before image transmission, and then transmits the media asset image to the display end so that the display end can scale it. Because the black block image around the media asset image has been removed, scaling the media asset image yields a displayed picture without black blocks around it, so that the screen projection image displayed at the display end has no black borders on at least two sides, and the user experience is better.
In practical applications, most display ends are home televisions, and the display of a home television is generally in landscape orientation. In the above application scenario, in order to reduce the data processing amount of the screen projection end, an embodiment of the present application provides a method for processing a screen image. Specifically, referring to Fig. 10, Fig. 10 is a flowchart of a screen image processing method according to a feasible embodiment. The method is applicable to the screen projection end controller, where the screen projection end controller is further configured to execute steps S11 to S131/S132.
Step S11 reads the aspect ratio of the screen image;
there are various implementations of reading the aspect ratio of the screen image. For example, in some feasible embodiments, the screen projection end controller may compute the aspect ratio of the screen image from the resolution of the screen image: if the resolution of the screen image is 1080 × 1920, the aspect ratio of the screen image is 1080/1920. As another example, in some feasible embodiments, the aspect ratio of the screen projection end display may be stored in advance and directly used as the aspect ratio of the screen image when the screen image is generated: if the aspect ratio of the screen projection end display is 1920/1080, the aspect ratio of the screen image is 1920/1080.
Step S12 determines whether the aspect ratio is greater than 1;
whether the aspect ratio is greater than 1 may be determined by any numerical comparison conventionally used in the art, which is not limited here.
If the aspect ratio is greater than 1, step S131 outputs the screen image to a display end;
in this embodiment, the screen image may be transmitted in a data transmission mode commonly used in the art, for example, Bluetooth transmission, network transmission, and the like.
If the aspect ratio is less than or equal to 1, step S132 deletes the black block image of the screen image to obtain a media asset image.
For the implementation of deleting the black block image of the screen image, reference may be made to the above embodiments; details are not repeated here.
It can be seen that, in the scheme shown in this embodiment, when generating a screen image, the screen projection end controller determines in advance, according to the aspect ratio of the screen image, whether the screen image needs to be cropped. In an application scenario where the aspect ratio of the screen image is greater than 1, the screen image is a landscape image, and the display direction of the display end display is also landscape; in this case, even without cropping, the final screen image is displayed well on the display end display. Further, because the screen projection end controller does not crop the screen image in this process, the data processing amount of the screen projection end controller is reduced.
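A minimal sketch of the aspect-ratio gate in steps S11-S132 follows. The function names and the NumPy representation are assumptions for illustration, and the actual cropping routine is passed in as a callable:

```python
from typing import Callable
import numpy as np

def process_screen_image(screen_image: np.ndarray,
                         crop_black_blocks: Callable[[np.ndarray], np.ndarray]
                         ) -> np.ndarray:
    """Decide at the screen projection end whether a captured frame needs
    cropping before it is sent to the display end."""
    height, width = screen_image.shape[:2]
    aspect_ratio = width / height            # step S11: read the aspect ratio
    if aspect_ratio > 1:                     # step S12: landscape capture
        return screen_image                  # step S131: output unchanged
    return crop_black_blocks(screen_image)   # step S132: delete black blocks
```

Skipping the crop for landscape captures matches the common case of a landscape television at the display end, which is what saves the screen projection end controller the per-frame cropping work.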
In some application scenarios, if the screen projection end is in a media asset playing state, the screen projection end needs to continuously capture a plurality of screen images in response to a screen projection start instruction input by the user. To ensure that the display device shown in this embodiment is suitable for this application scenario, this embodiment provides an implementation for determining the screen image capture timing. Specifically, referring to Fig. 11, Fig. 11 is a flowchart illustrating an implementation of determining the screen image capture timing according to a feasible embodiment, where, if the display device is in the media asset playing state, the controller is configured to perform steps S21-S22.
In response to a screen projection start instruction input by the user, step S21 is executed to read the type of the currently played media asset;
there are various ways to read the type of the currently played media asset. For example, in some feasible embodiments, the media asset type is determined based on the file suffix of the currently played media asset.
Step S22 determines the screen image capture timing according to the media asset type.
Specifically, if the media asset type is non-video, the implementation of screen capture by the screen projection end controller may refer to Fig. 12, where the controller is further configured to execute step S221 to capture one frame of screen image in response to the presented media asset being switched.
For example, in a feasible embodiment, the screen projection end is in a photo playing stage. After receiving the screen projection start instruction, each time the screen projection end controller switches to another photo, it captures one frame of screen image, until the screen projection end controller exits the screen projection function.
If the media asset type is video, an implementation of the controller capturing screen images may refer to Fig. 13, where the controller is further configured to perform step S222 to capture one frame of screen image at intervals of a preset time.
In this embodiment, the preset time may be set as required, which is not limited here. For example, in a feasible embodiment the preset time may be 5 ms.
For example, in a feasible embodiment, the screen projection end is in a video playing stage. After receiving the screen projection start instruction, the screen projection end controller captures one frame of screen image every 5 ms, until the screen projection end controller exits the screen projection function.
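A sketch of the capture scheduling in steps S21, S221, and S222 is given below. The callback names, the suffix list, and the 5 ms default are assumptions used for illustration:

```python
import os
import time

VIDEO_SUFFIXES = {".mp4", ".mkv", ".avi"}  # hypothetical suffix list

def asset_type_of(path: str) -> str:
    """Step S21: derive the media asset type from the file suffix."""
    suffix = os.path.splitext(path)[1].lower()
    return "video" if suffix in VIDEO_SUFFIXES else "non-video"

def capture_loop(asset_path: str, capture_frame, asset_switched,
                 should_exit, interval_s: float = 0.005):
    """Step S22: choose the capture trigger from the media asset type.
    Video: one frame every preset interval (step S222).
    Non-video, e.g. photos: one frame each time the presented media asset
    is switched (step S221)."""
    if asset_type_of(asset_path) == "video":
        while not should_exit():
            capture_frame()
            time.sleep(interval_s)     # preset time, 5 ms in this example
    else:
        while not should_exit():
            if asset_switched():       # e.g. the user flipped to the next photo
                capture_frame()
```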
If the media asset type is video, in order to further reduce the data processing amount of the controller, an embodiment of the present application provides a way of generating media asset images. Specifically, referring to Fig. 14, Fig. 14 is a flowchart of the way of generating media asset images according to a feasible embodiment. The method is applicable to the screen projection end controller, where the screen projection end controller is further configured to execute steps S31 to S32.
Step S31: divide the display area of the screen projection end display into a valid area and an invalid area, where the valid area is the area corresponding to the media asset image of the first frame of screen image, the invalid area is the area corresponding to the black block image of the first frame of screen image, and the display area corresponds to the screen image;
under normal conditions, the display area of the projection display corresponds to the screen image; specifically, referring to fig. 15, fig. 15 is a schematic diagram illustrating a screen-projection end display interface and a screen image according to a possible embodiment. The display area of the projection end display can refer to the schematic diagram 31 in fig. 15, and the screen image can refer to the schematic diagram 32 in fig. 15. The screen projection end controller can identify a captured first frame of screen image, and identify a black block image and a media asset image; then, the display area is divided into an effective area and an ineffective area according to the corresponding relationship between the display area and the screen image, and the specific division effect can be seen from the schematic diagram 33 in fig. 15.
Step S32: sequentially delete the image corresponding to the invalid area from each frame of screen image to obtain the media asset images.
Since the application scenario shown in this embodiment is one where the media asset type is video, the resolution of every frame of media asset image is the same throughout the playing of the video. In this scenario, the screen projection end controller does not need to analyze every captured screen image; it can delete the image corresponding to the invalid area according to the correspondence between the display area of the screen projection end and the screen image, so as to obtain the media asset image.
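Steps S31-S32 amount to detecting the valid area once and reusing it, which might be sketched as follows (the helper that finds the valid area on the first frame is a hypothetical callable, not part of this application):

```python
from typing import Callable, Iterable, Iterator, Tuple
import numpy as np

Region = Tuple[int, int, int, int]  # (top, bottom, left, right) bounds

def crop_video_frames(frames: Iterable[np.ndarray],
                      detect_valid_region: Callable[[np.ndarray], Region]
                      ) -> Iterator[np.ndarray]:
    """Yield the media asset image of each frame of screen image.

    Because every frame of a video has the same resolution, only the first
    frame is analyzed (step S31); each later frame is cropped with the
    cached region, i.e. the image in the invalid area is deleted (step S32).
    """
    region = None
    for frame in frames:
        if region is None:
            region = detect_valid_region(frame)  # step S31: first frame only
        top, bottom, left, right = region
        yield frame[top:bottom, left:right]      # step S32: keep valid area
```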
The technical solution shown in this embodiment is further described below with reference to specific examples:
referring to fig. 16, fig. 16 is a schematic diagram illustrating a change process of a screen image captured by a screen-projecting end in a photo playing process according to a possible embodiment. Specifically, in the embodiment, the screen projection end is in the photo playing stage, and the screen projection end controller captures a frame of screen image in response to receiving the screen projection start instruction, which can specifically refer to the schematic diagram 41 in fig. 16. When the screen-end controller switches the photos, triggering the screen-end controller to capture the screen image for the second time specifically can refer to the schematic diagram 42 in fig. 16; when the screen-end controller switches the photos, triggering the screen-end controller to capture the screen image for the third time specifically can refer to the schematic diagram 43 in fig. 16; when the screen-end controller switches the photos, triggering the screen-end controller to capture the screen image for the fourth time may be performed in sequence according to the schematic diagram 44 … … in fig. 16 until the screen-end controller exits the screen-projection function.
Referring to Fig. 17, Fig. 17 is a schematic diagram illustrating the changes of the screen images captured by the screen projection end in a video playing process according to a feasible embodiment. Specifically, in this embodiment, the screen projection end is in the video playing stage. In response to receiving the screen projection start instruction, the screen projection end controller determines that the currently played media asset type is video, and captures one frame of screen image every 5 ms. For example, the screen image captured by the screen projection end controller at the 5th ms may refer to diagram 51 in Fig. 17; the screen image captured at the 10th ms may refer to diagram 52 in Fig. 17; the screen image captured at the 15th ms may refer to diagram 53 in Fig. 17; the screen image captured at the 20th ms is shown in diagram 54 in Fig. 17, and so on, until the screen projection end controller exits the screen projection function.
In some feasible embodiments, in order to further reduce the data processing amount of the controller, an embodiment of the present application provides a way of identifying the black block image. Specifically, referring to Fig. 18, Fig. 18 is a flowchart illustrating the way of identifying the black block image according to a feasible embodiment, where the controller is further configured to perform steps S41 to S45;
S41: recognize the image of the screen image downward from the top end of the screen image;
S42: in response to recognizing a non-black block image, record a first position, where the first position is the position in the screen image corresponding to the top edge of the non-black block image;
S43: recognize the image of the screen image upward from the bottom end of the screen image;
S44: in response to recognizing a non-black block image, record a second position, where the second position is the position in the screen image corresponding to the bottom edge of the non-black block image;
S45: determine that the image from the first position to the top end of the screen image is a black block image, and the image from the second position to the bottom end of the screen image is a black block image.
The way of identifying the black block image is described below with reference to a specific example. Fig. 19 is a schematic diagram of a screen image according to a possible embodiment. After capturing a screen image, the screen projection end controller may recognize the image of the screen image downward from the top end of the screen image, and record a first position in response to recognizing a non-black block image, where the first position is the boundary between the black block image and the non-black block image, as can be seen in Fig. 19. In response to the first position being recorded, the screen projection end controller in the technical solution shown in this embodiment does not continue to recognize the screen image downward; instead, it recognizes the image of the screen image upward from the bottom end of the screen image, and records a second position in response to recognizing a non-black block image, where the second position in this embodiment is the boundary between the black block image and the non-black block image. Finally, the screen projection end controller determines that the image from the first position to the top end of the screen image is a black block image, and the image from the second position to the bottom end of the screen image is a black block image. It can be seen that, with the way of identifying the black block image shown in this embodiment, the screen projection end controller does not need to analyze the whole frame of screen image, which reduces the data processing amount of the screen projection end controller to a certain extent.
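A sketch of the top/bottom scan in steps S41-S45 follows. The row-blackness test and its threshold are assumptions; the left/right variants (steps S61-S75 below) are the same scan applied to columns instead of rows:

```python
import numpy as np

def _is_black_row(screen_image: np.ndarray, row: int,
                  black_threshold: int) -> bool:
    """True when every pixel in the given row is within the black tolerance
    (black_threshold is a hypothetical value)."""
    return bool(np.all(screen_image[row] <= black_threshold))

def find_vertical_bounds(screen_image: np.ndarray,
                         black_threshold: int = 8):
    """Return (first, second): the first non-black row scanning down from
    the top (steps S41-S42) and the first non-black row scanning up from
    the bottom (steps S43-S44)."""
    height = screen_image.shape[0]
    first = 0
    while first < height and _is_black_row(screen_image, first, black_threshold):
        first += 1     # scan downward from the top end
    second = height - 1
    while second > first and _is_black_row(screen_image, second, black_threshold):
        second -= 1    # scan upward from the bottom end
    # Step S45: rows above `first` and below `second` are black block image;
    # rows first..second hold the media asset image.
    return first, second
```

Only the black border rows are ever read; the interior of the frame is skipped, which is what reduces the data processing amount.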
In some feasible embodiments, in order to further reduce the data processing amount of the controller, an embodiment of the present application provides another way of identifying the black block image. Specifically, referring to Fig. 20, Fig. 20 is a flowchart illustrating the way of identifying the black block image according to a feasible embodiment, where the controller is further configured to perform steps S51 to S55;
S51: recognize the image of the screen image upward from the bottom end of the screen image;
S52: in response to recognizing a non-black block image, record a second position, where the second position is the position in the screen image corresponding to the bottom edge of the non-black block image;
S53: recognize the image of the screen image downward from the top end of the screen image;
S54: in response to recognizing a non-black block image, record a first position, where the first position is the position in the screen image corresponding to the top edge of the non-black block image;
S55: determine that the image from the first position to the top end of the screen image is a black block image, and the image from the second position to the bottom end of the screen image is a black block image.
It can be seen that, with the way of identifying the black block image shown in this embodiment, the screen projection end controller does not need to analyze the whole frame of screen image, which reduces the data processing amount of the screen projection end controller to a certain extent.
In some feasible embodiments, in order to further reduce the data processing amount of the controller, an embodiment of the present application provides a way of identifying the black block image in the horizontal direction. Specifically, referring to Fig. 21, Fig. 21 is a flowchart illustrating the way of identifying the black block image according to a feasible embodiment, where the controller is further configured to perform steps S61 to S65;
S61: recognize the image of the screen image from the left side of the screen image toward the right;
S62: in response to recognizing a non-black block image, record a third position, where the third position is the position in the screen image corresponding to the left edge of the non-black block image;
S63: recognize the image of the screen image from the right side of the screen image toward the left;
S64: in response to recognizing a non-black block image, record a fourth position, where the fourth position is the position in the screen image corresponding to the right edge of the non-black block image;
S65: determine that the image from the third position to the left boundary of the screen image is a black block image, and the image from the fourth position to the right boundary of the screen image is a black block image.
The way of identifying the black block image is described below with reference to a specific example. Fig. 22 is a schematic diagram of a screen image according to a possible embodiment. After capturing a screen image, the screen projection end controller may recognize the image of the screen image from the left side of the screen image toward the right, and record a third position in response to recognizing a non-black block image, where the third position is the boundary between the black block image and the non-black block image, as can be seen in Fig. 22. In response to the third position being recorded, the screen projection end controller in this embodiment does not continue to recognize the screen image toward the right; instead, it recognizes the image of the screen image from the right side of the screen image toward the left, and records a fourth position in response to recognizing a non-black block image, where the fourth position in this embodiment is the boundary between the black block image and the non-black block image. Finally, the screen projection end controller determines that the image from the third position to the left side of the screen image is a black block image, and the image from the fourth position to the right side of the screen image is a black block image. It can be seen that, with the way of identifying the black block image shown in this embodiment, the screen projection end controller does not need to analyze the whole frame of screen image, which reduces the data processing amount of the screen projection end controller to a certain extent.
In some feasible embodiments, in order to further reduce the data processing amount of the controller, an embodiment of the present application provides another way of identifying the black block image in the horizontal direction. Specifically, referring to Fig. 23, Fig. 23 is a flowchart illustrating the way of identifying the black block image according to a feasible embodiment, where the controller is further configured to perform steps S71 to S75;
S71: recognize the image of the screen image from the right side of the screen image toward the left;
S72: in response to recognizing a non-black block image, record a fourth position, where the fourth position is the position in the screen image corresponding to the right edge of the non-black block image;
S73: recognize the image of the screen image from the left side of the screen image toward the right;
S74: in response to recognizing a non-black block image, record a third position, where the third position is the position in the screen image corresponding to the left edge of the non-black block image;
S75: determine that the image from the third position to the left boundary of the screen image is a black block image, and the image from the fourth position to the right boundary of the screen image is a black block image.
It can be seen that, with the way of identifying the black block image shown in this embodiment, the screen projection end controller does not need to analyze the whole frame of screen image, which reduces the data processing amount of the screen projection end controller to a certain extent.
In a specific implementation, the present application further provides a computer storage medium, where the computer storage medium may store a program, and the program, when executed, may include some or all of the steps in each embodiment of the screen image processing method provided by the present application. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present application may be implemented as software plus a required general-purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present application may be essentially embodied in the form of a software product, which may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, or an optical disk, and which includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods in the embodiments or some parts of the embodiments of the present application.
Finally, it should be noted that the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting them. Although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced, and such modifications or substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application. The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. Many modifications and variations are possible in light of the above teachings. The embodiments were chosen and described in order to best explain the principles and practical applications, so as to enable others skilled in the art to best utilize the embodiments, with various modifications as suited to the particular use contemplated.