CROSS-REFERENCE TO RELATED APPLICATION(S)
This application claims the benefit under 35 U.S.C. §119(e) of a U.S. Provisional application filed on Sep. 10, 2012 in the U.S. Patent and Trademark Office and assigned Ser. Nos. 61/698,909 and 61/698,985, and under 35 U.S.C. §119(a) of Korean patent applications filed on Sep. 20, 2012 and Jan. 11, 2013 in the Korean Intellectual Property Office and assigned Serial Nos. 10-2012-0104794, 10-2012-0104823, and 10-2013-0003465, the entire disclosures of which are hereby incorporated by reference.
TECHNICAL FIELD
The present disclosure relates to a function execution method and apparatus of a mobile terminal. More particularly, the present disclosure relates to a method and apparatus for managing a connection between a mobile terminal and an external display apparatus.
BACKGROUND
The present disclosure is applicable to Orthogonal Frequency Division Multiple Access (OFDMA) systems such as 3rd Generation Partnership Project Long Term Evolution (3GPP LTE) and other similar systems.
With the advance of digital technologies, various types of mobile terminals capable of communicating and processing data (e.g., cellular communication terminal, Personal Digital Assistant (PDA), electronic organizer, smartphone, tablet Personal Computer (PC), and the like) are emerging. Recently, the mobile terminals are evolving into multifunctional devices integrating various functions in line with the mobile convergence tendency. For example, a recent mobile terminal integrates various functions including a voice and video telephony function, a messaging function including Short Message Service (SMS), Multimedia Message Service (MMS), and email, a navigation function, a document editing (e.g., memo and word processor) function, a picture capture function, a broadcast playback function, a multimedia (e.g., video and audio) playback function, an Internet access function, a messenger function, a Social Networking Service (SNS) function, and/or the like.
The recent mobile terminal supports the external output function capable of connecting to an external display device (e.g., Liquid Crystal Display (LCD) monitor) to display the data of the mobile terminal thereon. If the external display device is connected to the mobile terminal, the mobile terminal transfers the terminal's screen to the external display device (e.g., when the mobile terminal operates in clone mode). In addition, the mobile terminal may transfer only the execution screen (e.g., video data according to the motion picture playback) of a certain function (or application) which is currently running on the mobile terminal (e.g., when the mobile terminal operates in video only mode).
However, the aforementioned methods have a drawback in that when an external display device is connected to the mobile terminal, efficiently utilizing screens of different sizes is difficult. For example, the methods according to the related art are limited in function to the clone mode and video only mode.
In addition, the methods according to the related art support only one layout related to the screen displayed by the mobile terminal such that the external display device displays the screen in the same layout as the mobile terminal. Accordingly, the method according to the related art merely enlarges the same layout to fit the screen size of the external display device. As described above, the method according to the related art has a drawback in that when the mobile terminal is connected to an external display device, efficiently using displays of different sizes is difficult.
In addition, the dual display system according to the related art has a drawback in cost due to the requirement of an extra external input device. Of course, the external input device may be replaced by the input unit of the mobile terminal. However, in the case that the input unit of the mobile terminal is a touchscreen, using the touchscreen for controlling the external display device is difficult.
The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present disclosure.
SUMMARY
Aspects of the present disclosure are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present disclosure is to provide a desktop virtualization method and apparatus of a mobile terminal that is capable of operation in the desktop environment through the mobile terminal interoperating with an external display device.
An aspect of the present disclosure is to provide a desktop virtualization method and apparatus of a mobile terminal that is capable of implementing a desktop window environment through an external display device in a single system of the mobile terminal.
Another aspect of the present disclosure is to provide a desktop virtualization method and apparatus of a mobile terminal that is capable of supporting a large amount of information and various user experiences for the user.
Another aspect of the present disclosure is to provide an intelligent service provision method and apparatus that is capable of improving user convenience and mobile terminal usability by implementing an optimal environment for supporting the desktop window environment in the mobile terminal.
Another aspect of the present disclosure is to provide a per-display device window layout provision method and apparatus of a mobile terminal that is capable of displaying distinct screens with different window layouts.
Another aspect of the present disclosure is to provide a per-display device window layout provision method and apparatus of a mobile terminal that is capable of supporting different window layout screens for the distinct display devices.
Another aspect of the present disclosure is to provide a per-display device window layout provision method and apparatus of a mobile terminal that is capable of supporting screen displays with different window layouts for distinct display devices according to the display device outputting an execution screen of an application running on the mobile terminal.
Another aspect of the present disclosure is to provide a per-display device window layout provision method and apparatus of a mobile terminal that is capable of providing the user with various window layouts corresponding to different display devices and supporting display of various information according to the screen sizes of the display devices.
Another aspect of the present disclosure is to provide a per-display device window layout provision method and apparatus of a mobile terminal that is capable of improving user convenience and mobile terminal usability by implementing an optimal environment for supporting screen displays most appropriate for the respective displays, especially when an application is running on the mobile terminal.
Another aspect of the present disclosure is to provide a method for providing a mobile terminal user with a dual display mode using an external display device without an extra external input device.
Another aspect of the present disclosure is to provide a method for providing a mobile terminal user with a dual display mode using a touch screen of the mobile terminal and an external display device.
In accordance with an aspect of the present disclosure, a function control method using a mobile terminal and an external display device is provided. The function control method includes detecting a connection of the external display device, outputting a default desktop window screen to the external display device, receiving a control input, controlling, when the control input is an external input for controlling a desktop region of the external display device, a screen display of the desktop region, and controlling, when the control input is an internal input for controlling a default region of the mobile terminal, a screen display of the default region.
In accordance with another aspect of the present disclosure, a function control method using a mobile terminal and an external display device is provided. The function control method includes detecting a connection between the mobile terminal and the external display device, executing a desktop mode, displaying a default desktop window screen for the desktop mode through the external display device, and controlling a desktop virtualization function in a desktop window environment through the external display device.
In accordance with another aspect of the present disclosure, a mobile terminal is provided. The mobile terminal includes an interface unit configured to provide at least one of wired and wireless interface for connection of an external display device and an external input device, a touchscreen configured to display a window screen having a layout for a default region of the mobile terminal and a virtual input device for desktop environmental control of a desktop region of the external display device and to receive a control input using the virtual input device, and a controller configured to control desktop virtualization for displaying, when interoperating with the external display device, data and application generated in the mobile terminal in correspondence to the desktop environment through the external display apparatus and processing operations.
In accordance with another aspect of the present disclosure, a non-transitory computer-readable storage medium is provided. The non-transitory computer-readable storage medium is recorded with a program that, when executed, causes at least one processor to perform a method including executing, when a mobile terminal is connected to an external display device, a desktop mode, displaying a default desktop window screen for the desktop mode through the external display device, and controlling a desktop virtualization function in a desktop window environment through the external display device.
In accordance with another aspect of the present disclosure, a screen display method using a mobile terminal and an external display device is provided. The screen display method includes detecting a window creation event occurring with execution of an application, determining a display device for displaying a window in response to the window creation event, acquiring resource of the display device for displaying the window, determining a window layout policy of the display device corresponding to the acquired resource, creating the window on the display device in a layout according to the determined window layout policy, and displaying an execution screen of the application through the window.
In accordance with another aspect of the present disclosure, a screen display method using a mobile terminal and an external display device is provided. The screen display method includes acquiring, at the mobile terminal when the external display device is connected, resource of the external display device, determining a window layout policy corresponding to the acquired resource, outputting a desktop window screen determined according to the window layout policy to the external display device, displaying, at the external display device, the desktop window screen according to the window layout policy, determining, at the mobile terminal when a user input for executing an application is detected, a window creation region for the application, determining, when the window creation region is the external display device, the window layout policy for the external display device, outputting an application execution screen according to the determined window layout policy to the external display device, and displaying, at the external display device, the application execution screen in the layout according to the window layout policy.
In accordance with another aspect of the present disclosure, a method for providing a window layout per display device is provided. The method includes configuring window layout policies for display devices, determining, when a window is created in response to an application execution request, the window layout policy to be applied to the window, and creating the window in a layout according to the determined window layout policy.
In accordance with another aspect of the present disclosure, a system for displaying a screen according to per-display device layout policies is provided. The system includes a mobile terminal which supports multi-screen function in connection with an external display device and which outputs a window formatted according to the window layout policy corresponding to a resource of the connected external display device and an external display device which displays the window output by the mobile terminal in the layout according to the window layout policy.
In accordance with another aspect of the present disclosure, a mobile terminal is provided. The mobile terminal includes an interface unit configured to provide at least one of wired and wireless interfaces for connection of an external display device, a display unit configured to display a screen according to a preconfigured window layout policy, and a controller configured to configure window layout policies of display devices, to determine, when creating a window in response to an application execution request, the window layout policy to be applied to the window according to the preconfigured window layout policy, to create the window in a layout according to the window layout policy, and to control the external display device to display the window.
In accordance with another aspect of the present disclosure, a non-transitory computer-readable storage medium is provided. The non-transitory computer-readable storage medium is recorded with a program that, when executed, causes at least one processor to perform a method including detecting a window creation event occurring with execution of an application, determining a display device for displaying a window in response to the window creation event, acquiring resource of the display device for displaying the window, determining a window layout policy of the display device corresponding to the acquired resource, creating the window on the display device in a layout according to the determined window layout policy, and displaying an execution screen of the application through the window.
In accordance with another aspect of the present disclosure, a method for processing a user input made on a mobile terminal equipped with a touchscreen and connected to an external display device is provided. The method includes detecting the user input on the touchscreen, determining whether the user input is related to a first screen of the external display device, and performing, when the user input is related to the first screen, a function related to the first screen in response to the user input and, when the user input is not related to the first screen, a function related to a second screen of the touchscreen in response to the user input.
In accordance with still another aspect of the present disclosure, a mobile terminal is provided. The mobile terminal includes an interface unit configured to connect to an external display device through a wired link, a radio communication unit configured to connect to the external display device through a wireless link, a touchscreen including a touch panel and a display panel, and a controller configured to control the interface unit, the radio communication unit, and the touchscreen, wherein the controller detects a user input, determines whether the user input is related to a first screen of the external display device, and performs, when the user input is related to the first screen, a function related to the first screen in response to the user input and, when the user input is not related to the first screen, a function related to a second screen of the touchscreen in response to the user input.
Other aspects, advantages, and salient features of the disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses various embodiments of the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and other aspects, features, and advantages of certain embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic diagram illustrating a configuration of a mobile terminal according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram illustrating connections of external devices to a mobile terminal according to an embodiment of the present disclosure;
FIG. 3 is a signal flow diagram illustrating interoperation between a mobile terminal and an external display device in a display system according to an embodiment of the present disclosure;
FIGS. 4 and 5 are diagrams illustrating displays of a mobile terminal and an external display device according to an embodiment of the present disclosure;
FIGS. 6 and 7 are diagrams illustrating displays of a mobile terminal and an external display device for explaining an input operation according to an embodiment of the present disclosure;
FIGS. 8, 9, 10, 11, and 12 are diagrams illustrating screen displays of a mobile terminal and an external device for explaining a desktop virtualization operation according to an embodiment of the present disclosure;
FIG. 13 is a flowchart illustrating a desktop virtualization method of a mobile terminal according to an embodiment of the present disclosure;
FIG. 14 is a flowchart illustrating a desktop virtualization method of a mobile terminal according to an embodiment of the present disclosure;
FIG. 15 is a flowchart illustrating a desktop virtualization method of a mobile terminal according to an embodiment of the present disclosure;
FIG. 16 is a flowchart illustrating a per-display device window layout provision method of a mobile terminal according to an embodiment of the present disclosure;
FIG. 17 is a flowchart illustrating a per-display device window layout management method of a mobile terminal according to an embodiment of the present disclosure;
FIG. 18 is a signal flow diagram illustrating signal flows between a mobile terminal and an external display device interoperating in a method according to an embodiment of the present disclosure;
FIGS. 19A, 19B, 19C, and 19D are diagrams illustrating screen displays of a mobile terminal and an external display device for explaining an interoperation therebetween according to an embodiment of the present disclosure;
FIG. 20 is a flowchart illustrating a method for processing a user input according to an embodiment of the present disclosure;
FIG. 21 is a diagram illustrating software architecture for explaining a procedure of moving a pointer on an external display device according to an embodiment of the present disclosure;
FIG. 22 is a diagram illustrating screen displays of a mobile terminal and an external display device for explaining a procedure of moving a pointer on an external display device according to an embodiment of the present disclosure;
FIG. 23 is a diagram illustrating software architecture for explaining a procedure of changing a size of an image displayed on an external display device according to an embodiment of the present disclosure; and
FIG. 24 is a diagram illustrating screen displays of a mobile terminal and an external display device for explaining a procedure of changing a size of an image displayed on an external display device according to an embodiment of the present disclosure.
The same reference numerals are used to represent the same elements throughout the drawings.
DETAILED DESCRIPTION
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the present disclosure as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the various embodiments described herein can be made without departing from the scope and spirit of the present disclosure. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to the bibliographical meanings, but, are merely used by the inventor to enable a clear and consistent understanding of the present disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the present disclosure is provided for illustration purpose only and not for the purpose of limiting the present disclosure as defined by the appended claims and their equivalents.
It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
The present disclosure relates to a method and apparatus for implementing desktop virtualization of a mobile terminal. According to various embodiments of the present disclosure, the desktop virtualization is capable of allowing the mobile terminal interoperating with an external display device to render the data and application processed thereby to fit the desktop window environment corresponding to the external display device.
In the following description, the mobile terminal may be any of all the types of information communication devices, multimedia devices, and equivalents thereof using an Application Processor (AP), a Graphic Processing Unit (GPU), and a Central Processing Unit (CPU). For example, the mobile terminal may be any of a cellular communication terminal operating with various communication protocols corresponding to the communication systems, a tablet Personal Computer (PC), a smartphone, a digital camera, a Portable Multimedia Player (PMP), a Media Player (e.g., MP3 player), a handheld e-book, a portable game console, a Personal Digital Assistant (PDA), and the like. In addition, according to various embodiments of the present disclosure, the gesture-based control method may be applied to various display devices such as a digital television (TV), a Digital Signage (DS), a Large Format Display (LFD), a laptop computer, a desktop computer, and/or the like.
Various embodiments of the present disclosure may be implemented with the mobile terminal supporting connection to an external display device and the external device displaying a desktop window environment. Various embodiments of the present disclosure may further include an input device (e.g., a wired/wireless keyboard, an external mouse, and/or the like) capable of manipulating windows presented on the external display device connected to the mobile terminal.
Various embodiments of the present disclosure relate to a per-display device window layout provision method and apparatus of a mobile terminal that are capable of displaying screens having different layouts for respective display devices. Particularly, according to various embodiments of the present disclosure, when at least one external display device is connected to the mobile terminal, the execution screen of the same application is displayed in windows with different layouts on the mobile terminal's display unit and the external display device.
According to various embodiments of the present disclosure, differentiating among display devices based on the respective resources thereof (e.g., device type and screen size) is possible and configuring a window layout policy per display device is possible. According to various embodiments of the present disclosure, the window layout policy may be the information for determining the window layout of the execution screen depending on the resource (particularly, screen size) of the display device.
The display method according to various embodiments of the present disclosure may be implemented with a mobile terminal supporting multi-screen function and connection of external display device and an external display device for displaying the execution screen of an application running on the mobile terminal according to the window layout policy determined by the mobile terminal. Various embodiments of the present disclosure may include configuring a window layout policy per display device, determining a window layout per display device, and applying a window layout per display device according to the window layout policy.
For example, according to various embodiments of the present disclosure, the display method may distinguish among the display devices according to the resources of the display devices and configure a window layout policy per display device regardless of the connection of the external display device to the mobile terminal. The window layout policy may be provided by a provider as default or defined diversely according to the user configuration. When a window is generated according to the execution of an application, determining the area of the display device for generating the window and determining the window layout policy of the display device on which the window is generated from the preconfigured per-display device window layout policies are possible. If a window layout policy for the display device exists, the corresponding window layout policy is applied; otherwise, if no window layout policy for the display device exists, the default window layout policy is applied.
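By way of a non-limiting illustration only, the following sketch shows one possible way of selecting a window layout policy per display device with a fallback to a default policy, as described above. The class and identifier names (e.g., LayoutPolicyResolver, DESKTOP_WINDOWED) are assumptions introduced for illustration and do not form part of the disclosed implementation.

```java
// Illustrative sketch: per-display window layout policy selection with a default fallback.
// All names and the keying by device type are assumptions for illustration only.
import java.util.HashMap;
import java.util.Map;

final class LayoutPolicyResolver {
    enum WindowLayoutPolicy { PHONE_DEFAULT, DESKTOP_WINDOWED }

    // Per-display policies keyed by a display resource (here: a device type string).
    private final Map<String, WindowLayoutPolicy> policies = new HashMap<>();

    void configure(String deviceType, WindowLayoutPolicy policy) {
        policies.put(deviceType, policy);
    }

    // If a policy exists for the display device, apply it; otherwise apply the default policy.
    WindowLayoutPolicy resolve(String deviceType) {
        return policies.getOrDefault(deviceType, WindowLayoutPolicy.PHONE_DEFAULT);
    }

    public static void main(String[] args) {
        LayoutPolicyResolver resolver = new LayoutPolicyResolver();
        resolver.configure("HDMI_MONITOR", WindowLayoutPolicy.DESKTOP_WINDOWED);

        System.out.println(resolver.resolve("HDMI_MONITOR"));   // DESKTOP_WINDOWED
        System.out.println(resolver.resolve("UNKNOWN_DEVICE")); // PHONE_DEFAULT (fallback)
    }
}
```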
Hereinafter, a description is made of the configuration of the mobile terminal and control method thereof according to various embodiments of the present disclosure. The configuration of the mobile terminal and control method thereof according to various embodiments of the present disclosure is not limited to the following description but may be embodied with various modifications.
FIG. 1 is a schematic diagram illustrating a configuration of a mobile terminal according to an embodiment of the present disclosure.
Referring to FIG. 1, the mobile terminal 100 includes a radio communication unit 110, a user input unit 120, a display unit 130, an audio processing unit 140, a storage unit 150, an interface unit 160, a controller 170, and a power supply 180. According to various embodiments of the present disclosure, the mobile terminal 100 may be implemented with or without any of the components shown in FIG. 1. If the mobile terminal 100 supports image capturing function, the mobile terminal 100 may further include a camera module. If the mobile terminal 100 does not support the mobile communication function, a part of the radio communication unit 110 (e.g., cellular communication module) may be omitted.
The radio communication unit 110 may include at least one communication module capable of allowing the mobile terminal 100 to communicate with a radio communication system or a network to which another device is connected. For example, the radio communication unit 110 may include at least one of a cellular communication module 111, a Wireless Local Area Network (WLAN) module 113, a short range communication module 115, a location positioning module 117, and a broadcast reception module 119.
The cellular communication module 111 communicates radio signals with at least one of a base station, an external terminal, and various servers (e.g., an integration server, a provider server, a content server, and/or the like). The radio signals may include various types of data for voice telephony, video conference, or text/multimedia messaging services. The cellular communication module 111 connects to at least one of the various servers to receive an application supporting at least one of mobile window environment and desktop window environment under the control of the controller 170.
According to various embodiments of the present disclosure, the mobile window environment is the environment in which the function (or application) execution screen is rendered to be fit for the window corresponding to the screen size of the display unit 130 of the mobile terminal 100. The desktop window environment is the environment in which the function (or application) execution screen of the mobile terminal 100 is rendered variously in size according to the screen size of the external display device for desktop environment operations.
Meanwhile, the cellular communication module 111 may connect to at least one of the various servers to receive the window layout policy for various display devices under the control of the controller 170.
The WLAN module 113 is the module for connecting to the wireless Internet and establishing a WLAN link with another mobile terminal. According to various embodiments of the present disclosure, the WLAN module 113 may be embedded in the mobile terminal 100 or attached as an external device. The WLAN module 113 may support at least one wireless Internet technology among Wi-Fi, Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like. The WLAN module 113 may connect to at least one of the various servers to receive the application supporting at least one of the mobile window environment and desktop window environment according to the user's selection. When a WLAN link is established with another mobile terminal, the WLAN module 113 may transmit or receive the application selected by the user to and from the other mobile terminal.
The WLAN module 113 connects to at least one of the various servers to receive the window layout policy for various display devices under the control of the controller 170. If a WLAN link is established with another mobile terminal, the WLAN module 113 may transmit or receive the per-display device window layout policies to and from the other mobile terminal according to the user's selection.
The short range communication module 115 is the module for short range communication. Examples of short range communication technologies include Bluetooth, Bluetooth Low Energy (BLE), Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), and the like. The short range communication module 115 may establish a short range communication link with another mobile terminal to transmit or receive data to and from the other mobile terminal. According to various embodiments of the present disclosure, the short range communication module 115 may support connection of an external input device (e.g., a Bluetooth keyboard, mouse, and/or the like). When a short range communication link is established with another mobile terminal, the short range communication module 115 may transmit or receive per-display device window layouts to and from the other mobile terminal.
The location positioning module 117 is responsible for positioning the location of the user device, and a Global Positioning System (GPS) module is one of the representative location positioning modules. The location positioning module 117 collects accurate distance and time information from at least three base stations and performs triangulation based on the acquired information to acquire 3-Dimensional (3D) location information with latitude, longitude, and altitude. The location positioning module 117 is also capable of calculating the location information based on the signals from three or more satellites in real time. The location information of the user device may be acquired using various methods.
The broadcast reception module 119 receives broadcast signals (e.g., TV broadcast signal, radio broadcast signal, and data broadcast signal) and/or information on the broadcast (e.g., broadcast channel information, broadcast program information, and broadcast service provider information) from an external broadcast management server through a broadcast channel (e.g., satellite broadcast channel, and terrestrial broadcast channel).
The input unit 120 generates an input signal for controlling the operation of the user device in response to the user input. The input unit 120 may include a key pad, a dome switch, a touch pad (capacitive/resistive), a jog wheel, a jog switch, and/or the like. The input unit 120 may be implemented with external buttons and/or virtual buttons on the touch panel.
According to various embodiments of the present disclosure, the input unit 120 may include a plurality of keys for receiving alphanumeric information and configuring various functions. The plurality of keys may include a menu key, a screen on/off key, a power on/off key, a volume control key, and/or the like. The input unit 120 generates a key signal corresponding to the user configuration and function control keys and outputs the key signal to the controller 170. The key signal may include a power on/off signal, a volume control signal, and a screen on/off signal. The controller 170 controls the components in response to the key signal. The keys of the input unit 120 are referred to as hard keys, and the virtual keys presented on the display unit 130 are referred to as soft keys.
The display unit 130 displays (outputs) the information processed by the user device. For example, in the case that the user device is operating in a telephony mode, the display unit 130 displays a telephony User Interface (UI) or Graphic UI (GUI). In the case that the user device is operating in a video telephony mode or a picture shooting mode, the display unit 130 displays a UI or GUI displaying the picture taken by the camera or received through the communication channel. Particularly, the display unit 130 displays the execution screens of various functions (or applications) running on the mobile terminal 100. The display unit 130 presents a virtual input device (e.g., virtual touch pad) and generates a signal corresponding to the input made by means of the virtual input device to the controller 170. For example, the display unit 130 displays various execution screens of the mobile terminal 100 and may present the virtual input device screen according to the user's selection in the state that the execution screen is displayed.
The display unit 130 displays execution screens of various functions (or applications) running on the mobile terminal 100 according to the window layout policy. The display unit 130 is also capable of supporting display mode switching function for switching between portrait mode and landscape mode according to the rotation direction (or orientation) of the user device. The operation of the display panel 131 is described later with reference to screens.
The display unit 130 may be implemented with any of a Liquid Crystal Display (LCD), a Thin Film Transistor LCD (TFT LCD), a Light Emitting Diode (LED), an Organic LED (OLED), an Active Matrix OLED (AMOLED), a flexible display, a bended display, a 3-Dimensional (3D) display, and/or the like. The display unit 130 may be implemented as a transparent display panel through which the light penetrates.
In the case that the display unit 130 and the touch panel detecting touch gesture are layered (hereinafter, referred to as touchscreen), the display unit 130 may work as an input device as well as output device. The touch panel may be configured to convert change in pressure or capacitance at a specific region of the display unit 130 to an electric signal. The touch panel may be configured to detect the pressure of the touch as well as the touched position and size. If touch gesture is made on the touch panel, the corresponding signal(s) is generated to a touch controller (not shown). The touch controller (not shown) processes the signal(s) and generates corresponding data to the controller 170. The controller 170 is aware of the touched region.
As described above, the display unit 130 includes the touch panel and a display panel. The touch panel may be placed on the display panel of the display unit. In detail, the touch panel may be implemented in add-on type in which the touch panel is placed on the display panel or in on-cell type or in-cell type in which the touch panel is embedded into the display panel.
The touch panel generates an analog signal (e.g., touch event) in response to the user's touch gesture made on the display unit 130 and converts the analog signal into a digital signal which is provided to the controller 170. The touch event includes touch coordinates (x, y). For example, the touch panel determines some of the coordinates within the touch area (e.g., area touched by a user's finger or a pen) as the touch coordinates and sends the touch coordinates to the controller 170. The touch coordinates may be indicated by pixel. For example, if the screen resolution of the display unit 130 is 640*480, the X axis coordinate is in the range of 0 to 640 and the Y axis coordinate is in the range of 0 to 480. The touch panel may generate the touch signal including coordinates of the touched area to the controller 170. In this case, the controller 170 determines some of the received coordinates as the touch coordinates.
If the touch coordinates are received from the touch panel, the controller 170 determines that a touch tool (e.g., finger or pen) is in contact with the touch panel and, if no touch coordinate is received, that the touch has been released. In addition, if the coordinates are changed (e.g., from (x1, y1) to (x2, y2)), the controller 170 determines that the touch has moved. In response to the touch movement, the controller calculates the displacement (dx, dy) and movement speed of the touch. The controller 170 determines the user gesture as one of touch, double touch, tap, double tap, long tap, tap & touch, drag, flick, press, pinch-in, and pinch-out, and/or the like based on the touch coordinates, whether the touch is applied or released, whether the touch moves, touch displacement, touch movement speed, and/or the like. The ‘touch’ is a user's gesture of contacting a touch tool at a position on the touchscreen; the ‘multi-touch’ is the gesture of contacting the touch tool at two or more positions (e.g., with thumb and index finger) on the touch screen; the ‘tap’ is the gesture of contacting a position on the screen with a touch tool and releasing the contact (touch-off) without moving the touch tool; the ‘double tap’ is the gesture of making the tap twice; the ‘long tap’ is the gesture of maintaining the contact for a long time as compared to the tap and then releasing the contact; the ‘tap and touch’ is the gesture of making a tap at a certain position and then making a touch at the same position within a predetermined time (e.g., 0.5 second); the ‘drag’ is the gesture of contacting a position and moving the contact on the screen in a certain direction; the ‘flick’ is a user's gesture of snapping on the screen quickly as compared to the drag gesture; the ‘press’ is a user's gesture of contacting a certain position on the screen and applying pressure; the ‘pinch-in’ is the gesture of making two contact points with two touch tools and narrowing the distance between the two contact points; and the ‘pinch-out’ is the gesture of widening the distance between two contact points. For example, the ‘touch’ means the state of contacting the touch panel, and other gestures are changes in touched state. The touchscreen may be provided with a pressure sensor to detect the pressure at the touched position. The detected pressure information is sent to the controller 170 such that the controller 170 distinguishes between the touch and pressure based on the pressure information.
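By way of a non-limiting illustration only, the following sketch shows one possible way of classifying a gesture from the touch displacement, movement speed, and contact duration described above. The thresholds and identifier names are assumptions chosen for illustration and are not part of the disclosed implementation.

```java
// Illustrative sketch: distinguishing tap, long tap, drag, and flick from a touch sequence.
// The numeric thresholds are assumed values for illustration only.
final class GestureClassifier {
    enum Gesture { TAP, LONG_TAP, DRAG, FLICK }

    private static final double MOVE_THRESHOLD_PX = 10.0;  // assumed movement threshold
    private static final double FLICK_SPEED_PX_MS = 1.0;   // assumed flick speed (pixels/ms)
    private static final long LONG_TAP_MS = 500;           // assumed long-tap duration

    static Gesture classify(int x1, int y1, long tDown, int x2, int y2, long tUp) {
        double distance = Math.hypot(x2 - x1, y2 - y1);     // displacement magnitude
        long duration = Math.max(1, tUp - tDown);           // contact duration in ms

        if (distance < MOVE_THRESHOLD_PX) {
            return (duration >= LONG_TAP_MS) ? Gesture.LONG_TAP : Gesture.TAP;
        }
        double speed = distance / duration;                  // pixels per millisecond
        return (speed >= FLICK_SPEED_PX_MS) ? Gesture.FLICK : Gesture.DRAG;
    }

    public static void main(String[] args) {
        System.out.println(classify(100, 100, 0, 102, 101, 120)); // TAP
        System.out.println(classify(100, 100, 0, 400, 100, 150)); // FLICK
        System.out.println(classify(100, 100, 0, 400, 100, 900)); // DRAG
    }
}
```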
The touch panel may be implemented as a combined touch panel including a finger touch panel for detecting a gesture made by a human body and a pen touch panel for detecting a pen gesture made by a pen. The finger touch panel may be implemented as a capacitive type panel. The finger touch panel is capable of detecting the touch gesture made by a certain object (e.g., conductive material capable of changing electrostatic capacity) as well as a human body. The pen touch panel may be implemented with an electromagnetic induction type panel. In this case, the pen touch panel generates a touch event in response to the gesture made by a stylus pen manufactured to generate a magnetic field.
The display panel displays the image under the control of the controller 170. The controller 170 renders the data into an image and stores the image in a buffer. The display panel may display various images associated with the operation of the mobile terminal 100 (e.g., lock screen, home screen, application (or app) execution screen, keypad, and/or the like). If the mobile terminal powers on, the display unit 130 displays the lock screen. If a touch input or key input for unlock is detected, the controller changes the lock screen to the home screen or application execution screen. The home screen may include a background image and a plurality of icons arranged thereon. The icons correspond to the respective applications. If an icon is selected by the user (e.g., tap on an icon), the controller executes the corresponding application (e.g., browser) and displays the execution screen by means of the display unit 130. The display unit 130 may display the application execution screen as background and another image (e.g., keypad) on the foreground so as to be overlaid on the background. The display unit 130 displays a first image at a first region and a second image at a second region.
The audio processing unit 140 sends the audio signal received from the controller 170 to the speaker (SPK) 141 and sends the audio signal such as voice input through the microphone (MIC) 143 to the controller 170. The audio processing unit 140 is capable of processing the voice/sound data to output an audible sound wave through the speaker 141 and processing the audio signal including voice to generate a digital signal to the controller 170.
The speaker 141 is capable of outputting audio received by the radio communication unit 110 or stored in the storage unit 150 in the telephony mode, audio (video) recording mode, media content playback mode, broadcast reception mode, photo capture mode, and/or the like. The speaker 141 is also capable of outputting sound effects associated with the function executed in the mobile terminal (e.g., inbound call reception, outbound call placing, audio and video playback, photo shooting, and external output).
The microphone 143 is capable of processing the input acoustic signal to generate voice data in the telephony mode, audio (video) recording mode, speech recognition mode, photo capture mode, and/or the like. The processed voice data may be processed into the signal to be transmitted to the base station by means of the cellular communication module 111 in the telephony mode. The microphone 143 may be implemented with various noise cancellation algorithms to cancel the noise occurring in the audio signal input process.
The storage unit 150 may store programs associated with the processes and controls of the controller 170 and stores input/output data (e.g., phone number, message, multimedia contents including audio and video files, and applications) temporarily. The storage unit 150 may store the mobile terminal's function usage frequencies (e.g., application usage frequency, multimedia playback frequency, phone number usage frequency, messaging frequency, multimedia usage frequency, weight, priority, and preference). The storage unit 150 also may store the data associated with the various vibration patterns and sounds corresponding to the touch gestures made on the touchscreen. According to various embodiments of the present disclosure, the storage unit 150 may store various types of applications for the desktop virtualization function, and the applications may be classified into the ones supporting the mobile window environment and the ones supporting the desktop window environment. For example, certain applications performing the same operation may include a default mode application for outputting a mobile window environment screen and a desktop mode application for outputting a desktop window environment screen.
The storage unit 150 also stores the window layout policies for various types of display devices to provide window layout per display device. According to various embodiments of the present disclosure, the window layout policies may be mapped to the display devices by resource and stored in the form of a mapping table.
The storage unit 150 stores the booting program, Operating System (OS), middleware, and a virtual controller. According to various embodiments of the present disclosure, the virtual controller requests the OS to create a virtual touchscreen and controls the display unit 130 to display the execution screen corresponding to the virtual touchscreen. The kernel of the OS includes a U-input module. According to various embodiments of the present disclosure, the U-input module generates the virtual touchscreen in response to the request of the virtual controller, receives a touch event from the virtual controller, and sends the touch event to the virtual touchscreen. The middleware relays data between the OS and applications or between different types of applications. According to various embodiments of the present disclosure, an Xserver is a windows OS-based middleware that receives the user input (e.g., touch position displacement (dx, dy)) and transfers the received user input to the application associated with the external display device. Thereafter, the application performs a function corresponding to the user input (e.g., moves the pointer in proportion to the position displacement on the external display device).
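By way of a non-limiting illustration only, the following sketch models the forwarding of a touch displacement (dx, dy) to a pointer on the external display device in the spirit of the virtual controller flow described above. It does not reproduce the actual kernel U-input module or Xserver interfaces; all class and method names are assumptions for illustration.

```java
// Illustrative sketch: a touch displacement from the terminal's touch panel moves a pointer
// on the external display, clamped to the external display's resolution. Assumed names only.
final class VirtualPointer {
    private final int screenWidth;
    private final int screenHeight;
    private int x;
    private int y;

    VirtualPointer(int screenWidth, int screenHeight) {
        this.screenWidth = screenWidth;
        this.screenHeight = screenHeight;
        this.x = screenWidth / 2;   // start at the center of the external display
        this.y = screenHeight / 2;
    }

    // Move the pointer in proportion to the displacement reported by the touch panel.
    void onTouchDisplacement(int dx, int dy) {
        x = Math.max(0, Math.min(screenWidth - 1, x + dx));
        y = Math.max(0, Math.min(screenHeight - 1, y + dy));
    }

    public static void main(String[] args) {
        VirtualPointer pointer = new VirtualPointer(1920, 1080); // assumed monitor resolution
        pointer.onTouchDisplacement(120, -40);
        System.out.println("pointer at (" + pointer.x + ", " + pointer.y + ")");
    }
}
```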
The storage unit 150 is also capable of storing embedded applications and third party applications. The embedded applications are the applications installed in the apparatus basically. The embedded applications may include browser, email, instant messenger, and the like. The third party applications are diverse applications that may be downloaded from the online market and installed in the terminal. The third party applications may be installed and uninstalled freely. If the mobile terminal powers on, the booting program is loaded on the main memory device (e.g., RAM) of the controller 170. The booting program loads the OS and middleware of the mobile terminal onto the main memory device. The OS loads the applications on the main memory device to execute.
The storage unit 150 may be implemented with a storage medium of at least one of a flash memory type, a hard disk type, a micro type, a card type (e.g., a Secure Digital (SD) type and an eXtreme Digital (XD) card type) memories, a Random Access Memory (RAM), a Dynamic RAM (DRAM), a Static RAM (SRAM), a Read-Only Memory (ROM), a Programmable ROM (PROM), an Electrically Erasable PROM (EEPROM), a Magnetic RAM (MRAM), a magnetic disk, an optical disk type memory, and the like. The user device may interoperate with a web storage working as the storage unit 150 on the Internet.
The interface unit 160 provides the interface for the external devices connectable to the user device. The interface unit 160 is capable of transferring the data or power from the external devices to the internal components of the user device and transferring the internal data to the external devices. For example, the interface unit 160 may be provided with a wired/wireless headset port, an external charging port, a wired/wireless data port, a memory card slot, an identity module slot, an audio input/output port, a video input/output port, an earphone jack, and/or the like. According to various embodiments of the present disclosure, the interface unit 160 includes a data port for connection with at least one external display device through a wired or wireless link. For example, the interface unit 160 may include a High-Definition Multimedia Interface (HDMI) (including standard, mini, and micro HDMIs) and/or a Universal Serial Bus (USB) interface. When the mobile terminal 100 and the external display device are connected through a WLAN (Wi-Fi) link, the WLAN may be included in the interface unit 160. When the mobile terminal 100 and the external input device are connected through a Bluetooth link, the Bluetooth may be included in the interface unit 160.
The controller 170 controls overall operations of the mobile terminal 100. For example, the controller 170 controls the operations associated with voice telephony, data communication, video conference, desktop virtualization, and external output. The controller 170 may include a multimedia module (not shown) for the desktop virtualization and external output. According to various embodiments of the present disclosure, the multimedia module (not shown) may be embedded in the controller 170 or implemented in separation from the controller 170.
According to various embodiments of the present disclosure, the controller 170 controls the operations for supporting the desktop virtualization. For example, the controller 170 controls the desktop virtualization in which the data generated at the mobile terminal 100 in the state of interoperation with the external display device 200 and the application are processed to be fit for the desktop environment of the external display device 200. For example, the controller 170 executes, when the mobile terminal 100 and the external display device 200 are connected, the desktop mode and controls such that the default desktop window screen for the desktop mode is displayed on the external display device 200. The controller 170 controls the window screen for the mobile terminal and the window screen for the external display device with different layouts in control of the desktop virtualization.
When the external display device 200 is connected, the controller 170 outputs the default desktop window screen to the external display device 200 and monitors to detect an input of a control signal in the desktop mode. If a control signal for controlling the desktop area of the external display device is input from outside, the controller 170 controls such that the desktop area is displayed on the screen. If the control signal is an internal signal for controlling the default area of the mobile terminal, the controller 170 controls the default area screen display.
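By way of a non-limiting illustration only, the following sketch shows one possible dispatch of a control input either to the desktop region of the external display device or to the default region of the mobile terminal, as described above. The interfaces and names are assumptions for illustration and do not form part of the disclosed implementation.

```java
// Illustrative sketch: routing an input either to the external display's desktop region
// or to the mobile terminal's default region, depending on its source. Assumed names only.
final class ControlInputRouter {
    enum Source { EXTERNAL_INPUT, TERMINAL_TOUCHSCREEN }

    interface Region { void handle(String event); }

    private final Region desktopRegion;  // screen region of the external display device
    private final Region defaultRegion;  // screen region of the mobile terminal itself

    ControlInputRouter(Region desktopRegion, Region defaultRegion) {
        this.desktopRegion = desktopRegion;
        this.defaultRegion = defaultRegion;
    }

    void dispatch(Source source, String event) {
        if (source == Source.EXTERNAL_INPUT) {
            desktopRegion.handle(event);   // control the desktop area display
        } else {
            defaultRegion.handle(event);   // control the terminal's own display
        }
    }

    public static void main(String[] args) {
        ControlInputRouter router = new ControlInputRouter(
                e -> System.out.println("desktop region: " + e),
                e -> System.out.println("default region: " + e));
        router.dispatch(Source.EXTERNAL_INPUT, "mouse click");
        router.dispatch(Source.TERMINAL_TOUCHSCREEN, "tap on icon");
    }
}
```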
When the external display device 200 is connected, the controller 170 may acquire the information on the screen size of the external display device 200. The information on the screen size of the external display device may be acquired in the form of polling of the mobile terminal 100 or push of the external device 200.
The controller 170 controls displaying of the window screen with a desktop environment layout through the external display device 200 and providing window size adjustment button(s) at the top frame of the window displayed on the desktop area of the external display device 200.
According to various embodiments of the present disclosure, the controller 170 controls the operations for supporting the functions. For example, the controller 170 controls such that the data and application of the mobile terminal 100 are displayed through the connected external display device 200 in the layout corresponding to the window layout policy configured through the external display device 200. The controller 170 controls the window screen displayed on the external display device 200 in separation from the window screen displayed on the mobile terminal 100.
According to various embodiments of the present disclosure, the controller 170 controls the operation associated with the configuration of the window layout policies for the display devices. When generating a window according to an application execution request, the controller 170 may determine the layout to be applied to the window according to a predetermined window layout policy. The controller 170 generates the window in the layout determined according to the window layout policy to the external display device 200.
The controller 170 may acquire the resources for the display devices and search for the resource per display device and window layout policy mapped to the resource when determining the window. Particularly, the controller 170 may acquire the resource for the external display device 200 in the way of polling of the mobile terminal 100 or push of the external display device.
According to various embodiments of the present disclosure, the controller 170 may include a main memory device for storing application programs and OS, cache memory for storing data to be written to or read from the storage unit 150, a CPU, and a GPU. The OS is responsible for interfacing between the hardware and programs and manages the computing resources such as CPU, GPU, main memory device, and auxiliary device. For example, the OS operates the mobile terminal 100, schedules tasks, and controls operations of the CPU and GPU. The OS is also responsible for controlling execution of application programs and managing data and file storage. As well-known in the art, the CPU may be the main control unit of the computer system which is responsible for data operation and comparison and command analysis and execution. The GPU is the graphic control unit for performing data operation and comparison and command analysis and execution in association with graphics instead of the CPU. The CPU and GPU may be integrated into a package as a single integrated circuit composed of two or more independent cores (e.g., quad-core). The CPU and GPU may be packaged in multi-layered manner. The integrated CPU and GPU may be referred to as Application Processor (AP).
The control operations of the controller 170 are described later in detail with reference to the accompanying drawings.
The power supply 180 supplies power of the internal or external power source to the components under the control of the controller 170.
The various embodiments of the present disclosure may be implemented in hardware, firmware, software, or any combination thereof so as to be stored in a computer or similar device-readable storage medium. In the case of implementing the present disclosure by hardware, the present disclosure can be implemented with Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), a processor, a controller, a microcontroller, a microprocessor, and/or the like. In any case, the disclosed various embodiments of the present disclosure can be implemented on the controller 170. In the case of implementing various embodiments of the present disclosure by software, the procedures and functions disclosed in the specification can be implemented with separate software modules. The software modules are capable of performing at least one function and operation described in the specification.
The storage medium may be a non-transitory computer-readable storage medium recording the program capable of executing a desktop mode for the external display device 200 connected to the mobile terminal 100, displaying default desktop window screen on the external display device 200 in the desktop mode, and executing the desktop virtualization function based on the desktop window environment through the external display device 200.
The storage medium may be a non-transitory computer-readable medium recording the program capable of detecting a window creation event in accordance with the execution of an application, determining the display device for displaying the window in response to the window creation event, acquiring the resource for the display device, determining the window layout policy of the display device, generating the window on the display device in the layout according to the determined window layout policy, and displaying the execution screen of the application through the window.
According to various embodiments of the present disclosure, the mobile terminal 100 may be any of a cellular communication terminal operating with various communication protocols corresponding to the communication systems, a tablet Personal Computer (PC), a smartphone, a digital camera, a Portable Multimedia Player (PMP), a Media Player (e.g., MP3 player), a handheld e-book, a portable game console, a Personal Digital Assistant (PDA), and the like. In addition, the gesture-based control method according to one of various embodiments of the present disclosure may be applied to various display devices such as a digital television (TV), Digital Signage (DS), and a Large Format Display (LFD).
Although it is difficult to enumerate all of the functional components that can be converged in various manners according to the trend of digital convergence, the mobile terminal 100 may further include an acceleration sensor, a gyro sensor, a GPS module, a Near Field Communication (NFC) module, a vibration module, a camera, an accessory, and/or the like. The accessory may be a detachable part of the mobile terminal such as a pen to make a touch gesture on the display unit 130. A certain component of the mobile terminal 100 may be omitted or replaced by another component according to its implementation.
A description is made of the desktop virtualization according to various embodiments of the present disclosure with reference toFIGS. 2 to 15.
FIG. 2 is a schematic diagram illustrating connections of external devices to a mobile terminal according to an embodiment of the present disclosure.
Referring to FIG. 2, the system of the present disclosure includes a mobile terminal 100 supporting a multi-screen function in connection with an external display device and the external display device 200 for displaying the screen of the mobile terminal 100 in a desktop window environment. According to various embodiments of the present disclosure, the system may include external input devices 300 connected to the mobile terminal 100 and capable of manipulating the windows displayed on the mobile terminal 100 and/or the external display device 200. In FIG. 2, the external display device 200 may include a monitor, and the external input devices 300 may include a keyboard and a mouse.
As shown in FIG. 2, the mobile terminal 100 and the external display device 200 may connect through a wired interface or a wireless interface. For example, the mobile terminal 100 and the external display device 200 may connect to each other through HDMI or Wi-Fi. The mobile terminal 100 may output various screen data to the external display device 200 through the wired interface (e.g., HDMI) or the wireless interface (e.g., Wi-Fi).
The mobile terminal 100 and the external input devices 300 may connect through a wired interface or a wireless interface. For example, the mobile terminal 100 and the external input devices 300 may connect through a wired interface such as USB or a wireless interface such as Bluetooth. The mobile terminal 100 may receive a control signal input by means of the external input devices 300 through the wired interface (e.g., USB) or the wireless interface (e.g., Bluetooth). The control signal may be the signal for controlling the window screen displayed on the mobile terminal or the signal for controlling the window screen displayed on the external display device 200. According to various embodiments of the present disclosure, the mobile terminal 100 is connected to the external display device 200 and the external input devices 300 as shown in FIG. 2 so as to control the window screen displayed on the external display device 200 by means of the external input device 300 in the state in which the desktop virtualization function is activated in association with the external display device 200.
If a control signal is received from the external input devices 300, the mobile terminal 100 controls the window screen displayed on the external display device 200 and processes operations for displaying the result on the external display device 200 while maintaining the window screen of the mobile terminal 100.
FIG. 3 is a signal flow diagram illustrating interoperation between a mobile terminal and an external display device in a display system according to an embodiment of the present disclosure, and FIGS. 4 and 5 are diagrams illustrating displays of a mobile terminal and an external display device according to an embodiment of the present disclosure.
Referring to FIGS. 3, 4, and 5, at operation 301 the mobile terminal 100 is connected to the display device 200.
If a connection is established between the mobile terminal 100 and the display device 200 at operation 301, then at operation 303, the desktop mode is executed automatically such that the mobile terminal sends the external display device 200 a default desktop window screen. If the desktop mode is executed, the mobile terminal 100 executes the desktop mode internally and transmits the default desktop window screen to the external display device 200 while maintaining the current display of the mobile terminal 100 as shown in FIG. 4. When the external display device 200 is connected, the mobile terminal 100 acquires information on the screen size of the external display device 200 and determines the size of the default desktop window screen corresponding to the screen size. The information on the screen size of the external display device 200 may be acquired by polling or push at the time when the external display device 200 is connected.
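As a rough illustration of this step (not taken from the disclosure), the sketch below assumes a hypothetical ExternalDisplay interface that answers a screen-size poll when the connection is established; the terminal then sizes the default desktop window to match the reported screen.

```java
// Hypothetical sketch: sizing the default desktop window from the external
// display's reported screen size. ExternalDisplay and DesktopWindow are
// illustrative types introduced here, not part of the disclosure.
import java.awt.Dimension;

interface ExternalDisplay {
    Dimension getScreenSize();   // answered when the terminal polls the display on connect
}

final class DesktopWindow {
    final int width;
    final int height;

    DesktopWindow(int width, int height) {
        this.width = width;
        this.height = height;
    }
}

final class DesktopModeLauncher {
    /** Called when a connection to an external display is detected. */
    DesktopWindow enterDesktopMode(ExternalDisplay display) {
        Dimension screen = display.getScreenSize();            // poll path
        // Size the default desktop window to fill the external screen.
        return new DesktopWindow(screen.width, screen.height);
    }
}
```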
If the external display device 200 connects to the mobile terminal 100 at operation 301 and the default desktop window screen is received from the mobile terminal 100 at operation 303, then at operation 305, the external display device 200 displays the default desktop window screen. The default desktop window screen may include a predetermined background screen 230 and desktop shortcut icons 250 (or an icon list) of certain applications capable of executing operations corresponding to the desktop window environment through the external display device 200.
If a control event for a desktop window environment task occurs, then at operation 307, the mobile terminal 100 sends the screen corresponding to the control event to the external display device 200. For example, if the user makes an input for selecting one of the desktop shortcut icons 250 displayed on the external display device 200 by means of the mobile terminal 100 or the external input device 300, the mobile terminal 100 determines the application represented by the selected icon 250. The mobile terminal 100 executes the determined application in the desktop mode and sends the screen created for the desktop window environment (application desktop window screen) to the external display device 200. For example, when the execution of an application represented by the icon selected in the desktop area is detected, the mobile terminal 100 executes the application with the layout appropriate for the desktop environment.
If the external display device receives the application desktop window screen transmitted by the mobile terminal 100 at operation 307, then at operation 309, the external display device 200 displays the application desktop window screen. The application desktop window screen is provided in the form of the application execution screen having a layout for the desktop window environment different from the window screen of the mobile terminal 100 as shown in FIG. 5. At this time, the screen of the mobile terminal 100 may be maintained in a current state. The application execution screen includes a status bar at the top of the window frame corresponding to the desktop window environment, unlike the mobile application execution screen displayed on the mobile terminal 100. The status bar 550 may display status information (application name, and/or the like) on the execution screen (or corresponding window). Particularly, the status bar 550 may include window size adjustment buttons 555 such as minimization, maximization, and close buttons.
FIGS. 6 and 7 are diagrams illustrating displays of a mobile terminal and an external display device for explaining an input operation according to an embodiment of the present disclosure.
FIG. 6 shows screen displays in the case of using the mobile terminal 100 as the input means of the external display device 200 according to an embodiment of the present disclosure, and FIG. 7 shows screen displays in the case of using the external input device 300 connected to the mobile terminal 100 as the input means of the external display device 200 according to an embodiment of the present disclosure.
Referring to FIG. 6, the mobile terminal 100 may display a virtual input device (e.g., virtual touchpad) according to the user's selection. The user may input a control signal using the virtual input device displayed on the mobile terminal 100 to make a manipulation such as moving the pointer 750 displayed on the external display device 200, executing an application, moving an execution window screen, enlarging/shrinking the window size, minimizing the size, maximizing the size, closing the window, executing a desktop shortcut icon 250, or the like.
According to various embodiments of the present disclosure, the virtual input device may provide a window size adjustment region 630 capable of executing the functions of the buttons corresponding to the minimization button, maximization button, and close button of the status bar 550 at the top frame of the window displayed on the external display device 200; a touchpad region 650 for detecting a gesture for moving the pointer 750 on the external display device 200, an icon selection gesture, a single touch gesture, and a multi-touch gesture; and an indicator region 610 for providing various operation status information of the mobile terminal 100 in real time. The virtual input device may also provide a keypad execution button 670 for executing the virtual touch keypad.
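Purely as an illustration of how such a virtual input device might partition its surface, the sketch below hit-tests a touch point against the regions named above; the region bounds, the 720x1280 screen size, and all type names are invented for the example.

```java
// Illustrative only: classifying a touch on the virtual input device into the
// indicator, window-size-adjustment, touchpad, or keypad-button regions.
import java.awt.Rectangle;

enum TouchpadRegion { INDICATOR, SIZE_ADJUSTMENT, TOUCHPAD, KEYPAD_BUTTON, NONE }

final class VirtualTouchpadLayout {
    // Invented bounds for a 720x1280 terminal screen; real bounds would come
    // from the layout of the virtual input device application.
    private final Rectangle indicator = new Rectangle(0, 0, 720, 80);
    private final Rectangle sizeAdjustment = new Rectangle(0, 80, 720, 120);
    private final Rectangle touchpad = new Rectangle(0, 200, 720, 960);
    private final Rectangle keypadButton = new Rectangle(0, 1160, 720, 120);

    TouchpadRegion classify(int x, int y) {
        if (indicator.contains(x, y)) return TouchpadRegion.INDICATOR;
        if (sizeAdjustment.contains(x, y)) return TouchpadRegion.SIZE_ADJUSTMENT;
        if (touchpad.contains(x, y)) return TouchpadRegion.TOUCHPAD;
        if (keypadButton.contains(x, y)) return TouchpadRegion.KEYPAD_BUTTON;
        return TouchpadRegion.NONE;
    }
}
```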
Referring to FIG. 7, the mobile terminal 100 may be connected with the external input devices 300 (e.g., keyboard and mouse) through wireless links such as Bluetooth links or wired links such as USB cables. The user may input a control signal using the external input device 300 connected to the mobile terminal 100 to perform manipulations such as movement of the pointer 750 in the desktop region displayed on the external display device 200, application execution, executed window movement, window size enlargement/shrink, size minimization, size maximization, window close, execution of a desktop shortcut icon 250, or the like.
FIGS. 8, 9, 10, 11, and 12 are diagrams illustrating screen displays of a mobile terminal and an external device for explaining a desktop virtualization operation according to an embodiment of the present disclosure.
Referring to FIG. 8, a screen display in the state in which an application is running in the desktop window environment of the external display device 200 under the control of the mobile terminal 100 is illustrated. As shown in FIG. 8, the application execution window provides a status bar 550 having a window size adjustment button 555 at the top side of the window frame. FIG. 8 is directed to the case in which the external input device 300 is connected to the mobile terminal 100 for manipulating the window displayed on the external display device 200 such that the pointer 750 moves on the desktop region according to the input mode of the external input device 300.
When an application execution command is input by selecting an icon in the desktop region of the external display device 200, the mobile terminal 100 executes the application having the layout for the desktop environment and processes the corresponding screen output. If the application desktop window screen is received from the mobile terminal 100, the external display device 200 displays the application desktop window screen. The application desktop window screen is the application execution screen formed in the layout fit for the desktop environment, unlike the window screen of the mobile terminal 100, as shown in FIG. 8. At this time, the screen of the mobile terminal 100 may be maintained in a current state.
Referring to FIG. 9, a screen display in the state in which plural applications are executed in the desktop window environment of the external display device 200 is illustrated. Particularly, the plural application execution windows may be provided with a status bar 550 having the window size adjustment buttons 555 at the top sides of their frames, respectively. The status bar 550 of each window may provide the status information of the corresponding application. FIG. 9 is directed to the case in which the window displayed on the external display device 200 is adjusted by means of the external input device 300 connected to the mobile terminal 100 and the pointer 750 is provided in the desktop region according to the input mode of the external input device 300.
Referring to FIG. 10, a screen display in the case of extending the execution window of the application according to the user's manipulation in the state in which the application is executed in the desktop window environment through the external display device 200 is illustrated. The application execution window may be provided with a status bar 550. The status bar 550 may further include window size adjustment buttons 555. As shown in FIG. 10, the user may move the pointer 750 to the rightmost position of the window frame by means of the external input device 300. When the pointer 750 is placed at the edge, the pointer 750 is changed in shape to an indicator to indicate that the window size can be extended or shrunk. The user may make a control signal for rightward movement by manipulating the external input device 300. Then the window is extended as shown in FIG. 10. For example, if the window is extended in the state in which a gallery list is displayed in the window according to the execution of the gallery application, the window screen is provided in such a way as to include a region presenting detailed information (an enlarged photo) on the photos of the gallery list.
Referring to FIG. 11, screen displays in the case of moving an application execution screen according to the user's manipulation in the state in which the application is executed in the desktop window through the external display device 200 under the control of the mobile terminal 100 are illustrated. As shown in FIG. 11, the user may move the pointer 750 to the top side of the window frame (e.g., the status bar 550) using the external input device 300. When the pointer 750 is placed at a position on the status bar 550, the pointer 750 may be changed in shape to indicate that the window may be moved. In this state, the user may make a control input for movement in a direction (e.g., rightward) in the display region of the external display device 200 by manipulating the external input device 300. Then the window moves in response to the user input as shown in FIG. 11.
Referring to FIG. 12, screen displays in the state in which a specific application is executed in the desktop window environment through the external display device 200 under the control of the mobile terminal 100 are illustrated. As shown in FIG. 12, the application execution window may be provided with the status bar 550 having the window size adjustment button 555. Particularly, FIG. 12 shows the case in which an application is executed in the region of the mobile terminal 100 in the state of FIG. 8. As shown in FIG. 12, if an application execution event such as incoming call reception occurs in the state in which the mobile terminal 100 is connected to the external display device 200, the mobile terminal 100 displays the window screen (e.g., incoming call reception status screen) according to the execution event while the external display device 200 maintains its current state. As shown in FIG. 12, as the application is executed in the mobile terminal 100, the pointer 750 may disappear from the current screen of the external display device 200. This is to indicate intuitively that the current manipulation is performed on the mobile terminal.
If an application execution command input by the selection of an icon on the desktop region of the external display device 200 is detected, the mobile terminal 100 executes an application having the desktop environment-friendly layout and processes the screen display. If the application desktop window screen is received from the mobile terminal 100, the external display device 200 displays the application desktop window screen having the desktop environment-friendly layout. If the application is executed in the region of the mobile terminal, the mobile environment-friendly application screen is displayed as shown in FIG. 12.
Although FIGS. 8 to 12 are directed to the case in which the external input device 300 is connected to the mobile terminal 100, the desktop window environment of the external display device 200 may be manipulated through the virtual input device provided on the mobile terminal 100 as described above, with the omission of the external input device 300.
FIG. 13 is a flowchart illustrating a desktop virtualization method of a mobile terminal according to an embodiment of the present disclosure.
Referring to FIG. 13, at operation 1301, the controller 170 determines a current operation mode (e.g., standby mode and application execution mode).
At operation 1303, the controller 170 detects a connection of the external display device 200. For example, the controller 170 may detect the connection of the external display device 200 through a wired (HDMI) link or a wireless (Wi-Fi) link. If the external display device 200 is connected, the controller 170 executes the desktop mode for desktop virtualization and transmits a default desktop window screen to the external display device 200. If the external display device 200 is connected, the controller 170 may determine the type of the connection between the mobile terminal 100 and the external display device 200.
When the external display device 200 is connected to the mobile terminal 100 at operation 1303, the controller 170 proceeds to operation 1305 at which the controller 170 acquires the information on the screen size of the external display device 200. The information on the screen size of the external display device 200 is acquired by polling or push at the time when the mobile terminal and the external display device 200 connect to each other.
At operation 1307, the controller 170 determines the size of the default desktop window screen based on the information on the screen size of the external display device 200.
At operation 1309, the controller 170 outputs the default desktop window screen corresponding to the determined screen size to the external display device 200. The controller 170 may process the output of the default desktop window screen through the wired (HDMI) or wireless (Wi-Fi) link depending on the connection type of the external display device 200.
At operation 1311, the controller 170 detects a control input.
If a control input is detected in the state in which the default desktop window screen for desktop mode operation is displayed on the external display device 200 at operation 1311, then the controller 170 proceeds to operation 1313 at which the controller 170 determines whether the control input is an internal control input for controlling the default region of the mobile terminal 100 or an external control input for controlling the desktop region of the external display device 200.
If the controller 170 determines that the control input is not the external control input (e.g., if the control input is the internal control input) at operation 1313, then the controller 170 proceeds to operation 1315 at which the controller 170 controls such that the execution screen is displayed in the mobile terminal-friendly layout in the default region of the mobile terminal while the current screen of the external display device 200 is maintained. For example, the controller 170 controls displaying the window screen of the default region of the mobile terminal 100.
Conversely, if the controller 170 determines that the control input is the external control input at operation 1313, then the controller 170 proceeds to operation 1317 at which the controller 170 controls such that the desktop window screen is displayed on the desktop region of the external display device 200 in the desktop environment-friendly layout.
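A minimal sketch of the branch at operations 1313 to 1317 follows, under the assumption that each control input carries a flag saying which region it targets; the ControlInput, Region, and InputRouter types are illustrative and not taken from the disclosure.

```java
// Illustrative routing of a control input to the mobile (default) region or
// the desktop region of the external display, mirroring operations 1313-1317.
final class ControlInput {
    final boolean targetsDesktopRegion;   // true: external display, false: terminal

    ControlInput(boolean targetsDesktopRegion) {
        this.targetsDesktopRegion = targetsDesktopRegion;
    }
}

interface Region {
    void render(String screenDescription);   // hypothetical hook that displays an execution screen
}

final class InputRouter {
    private final Region mobileRegion;    // display unit of the terminal
    private final Region desktopRegion;   // external display device

    InputRouter(Region mobileRegion, Region desktopRegion) {
        this.mobileRegion = mobileRegion;
        this.desktopRegion = desktopRegion;
    }

    void dispatch(ControlInput input) {
        if (input.targetsDesktopRegion) {
            // External control input: update the desktop window screen only.
            desktopRegion.render("desktop-environment layout");
        } else {
            // Internal control input: update the terminal screen; the desktop
            // window screen on the external display is left untouched.
            mobileRegion.render("mobile-terminal layout");
        }
    }
}
```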
FIG. 14 is a flowchart illustrating a desktop virtualization method of a mobile terminal according to an embodiment of the present disclosure.
Referring to FIG. 14, a flowchart is illustrated for a case in which no external input device is connected to the mobile terminal 100. However, the mobile terminal 100 provides a virtual input device for manipulating the desktop region of the external display device.
Referring to FIG. 14, at operation 1401, the controller 170 determines the execution of the desktop mode. For example, the controller 170 determines whether the mobile terminal 100 is operating in the desktop mode.
At operation 1403, the controller 170 determines whether a request for virtual input device execution is detected.
If a virtual input device execution request is detected in the desktop mode at operation 1403, the controller 170 proceeds to operation 1405 at which the controller 170 executes the virtual input device application. For example, the controller 170 may receive the input requesting the execution of the virtual input device through manipulation of the mobile terminal (e.g., menu manipulation and shortcut key input) in the state in which the default desktop window screen is displayed on the external display device 200. Then the controller 170 controls the mobile terminal 100 such that the virtual input device application is executed in the default region of the mobile terminal 100 while maintaining the default desktop window screen of the external display device 200.
At operation 1407, the controller 170 displays the virtual input device on the mobile terminal 100 and the pointer on the external display device 200. For example, the controller 170 outputs the virtual input device in match with the layout of the default region of the mobile terminal 100 with the execution of the virtual input device application and controls to present the pointer 750 in the desktop region of the external display device 200.
At operation 1409, the controller 170 controls the desktop region according to the user input made through the virtual input device in the state in which the virtual input device is displayed. For example, the controller 170 may control the operations as described with reference to FIGS. 8, 9, 10, 11, and 12. For example, the controller 170 may control the tasks of the desktop environment such as pointer movement in the desktop region, application execution, executed window screen movement, window size extension/shrink, size minimization, size maximization, window close, and/or the like.
FIG. 15 is a flowchart illustrating a desktop virtualization method of a mobile terminal according to an embodiment of the present disclosure.
Referring to FIG. 15, the operations of processing the input from the external input device 300 in the state in which the external input device 300 is connected to the mobile terminal 100 are illustrated.
At operation 1501, the controller 170 determines the execution of the desktop mode.
At operation 1503, the controller 170 receives a control input made through the external input device 300.
If a control input is received from the external input device 300 in the desktop mode at operation 1503, then the controller 170 proceeds to operation 1505 at which the controller 170 maintains the current state of the screen of the default region of the mobile terminal 100.
Thereafter, at operation 1507, the controller 170 determines the control event of the external input device 300. For example, the controller 170 is capable of determining whether the control input is made through the keyboard or the mouse and which task is to be executed according to the control input.
At operation 1509, the controller 170 controls the output of the desktop region according to the identified control input. For example, the controller 170 may control the movement of the pointer in the desktop region of the external display device 200, application execution, executed window screen movement, window size enlargement/shrink, size minimization, size maximization, window close, or the like while maintaining the current state of the screen of the mobile terminal 100.
Descriptions are made of the various embodiments related to the per-display device window layout provision function hereinafter with reference to FIGS. 16, 17, 18, 19A, 19B, 19C, and 19D.
FIG. 16 is a flowchart illustrating a per-display device window layout provision method of a mobile terminal according to an embodiment of the present disclosure.
Referring to FIG. 16, at operation 1601, the controller 170 detects an input requesting to execute an application.
If the controller 170 detects a request for an application execution input by the user at operation 1601, then the controller 170 proceeds to operation 1603 at which the controller 170 acquires the resource of the display device on which the application is executed. For example, if the application execution request is detected, the controller 170 identifies the display device on which the application execution screen is to be displayed and acquires the resource of the display device. The resource of the display device is the resource of the display devices currently connected to the mobile terminal 100 (e.g., the display unit 130 of the mobile terminal 100 and the external display device connected to the mobile terminal 100) and is acquired from the storage unit 150 or the external display device.
If the controller 170 acquires the resource of the display device on which the application execution screen is displayed at operation 1603, then the controller 170 proceeds to operation 1605 at which the controller 170 determines a window layout policy configured for the acquired resource. For example, the controller 170 may retrieve a predetermined window layout policy corresponding to the resource of the display device from the mapping table of the storage unit 150.
At operation 1607, the controller 170 decorates (e.g., configures) the window layout of the application according to the determined window layout policy. For example, the controller 170 configures the layout of the window presenting the application execution screen according to the window layout policy.
At operation 1609, the controller 170 generates the window decorated (e.g., configured) according to the window layout on the display device and displays the application execution screen.
As described with reference to FIG. 16, when the user executes an application, the controller 170 acquires the resource of at least one display device currently connected to the mobile terminal 100 and selects one of the window layout policies configured based on the resource of the display device. The controller 170 decorates (e.g., configures) the window layout of the executed application according to the determined window layout policy.
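The following sketch shows one way the mapping from a display resource to a window layout policy could be held and looked up; the DisplayResource key, the policy names, and the default fallback are assumptions made for illustration rather than the disclosed data structures.

```java
// Illustrative mapping-table lookup (operations 1603-1609): acquire the
// resource of the target display, then pick the policy registered for it.
import java.util.HashMap;
import java.util.Map;

record DisplayResource(int widthPx, int heightPx) {}

enum WindowLayoutPolicy { MOBILE_DEFAULT, DESKTOP }

final class WindowLayoutPolicyTable {
    private final Map<DisplayResource, WindowLayoutPolicy> table = new HashMap<>();

    void register(DisplayResource resource, WindowLayoutPolicy policy) {
        table.put(resource, policy);
    }

    WindowLayoutPolicy policyFor(DisplayResource resource) {
        // Fall back to the default policy when no entry exists (see FIG. 17).
        return table.getOrDefault(resource, WindowLayoutPolicy.MOBILE_DEFAULT);
    }
}
```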
FIG. 17 is a flowchart illustrating a per-display device window layout management method of a mobile terminal according to an embodiment of the present disclosure.
Referring to FIG. 17, at operation 1701, the controller 170 detects the window generation request of the user.
At operation 1703, the controller 170 acquires the resource of the currently connected display device. For example, if the user's application execution request is detected, the controller 170 may determine that the application execution request corresponds to the window generation event for generating a new window. The controller 170 determines the display device for which the window is generated according to the application execution (e.g., the display unit 130 of the mobile terminal 100 and the external display device connected to the mobile terminal 100) and acquires the resource of the determined display device. The resource of the display device may include the resource of at least one of the display unit 130 of the mobile terminal and the external display device connected to the mobile terminal 100.
At operation 1705, the controller 170 searches for the window layout policy for the resource of the display device.
At operation 1707, the controller 170 determines whether the window layout policy is retrieved. For example, the controller 170 scans the resources of the display devices in correspondence to the resource acquired from the mapping table stored in the storage unit 150 and retrieves the window layout policy mapped to the display device of the acquired resource.
If the controller 170 retrieves a window layout policy for the display device at operation 1707, then the controller 170 proceeds to operation 1709 at which the controller 170 determines that the found window layout policy corresponds to the window layout policy of the window to be displayed on the display device.
At operation 1711, the controller 170 applies the determined window layout policy to the window to be displayed on the display device.
At operation 1713, the controller 170 displays the window to which the window layout policy is applied on the display device. For example, the controller 170 generates the window according to the applicable window layout policy.
Conversely, if the controller 170 does not retrieve a window layout policy for the display device at operation 1707, then the controller 170 proceeds to operation 1715 at which the controller 170 determines that the default window layout policy corresponds to the window layout policy of the window to be displayed on the display device.
At operation 1717, the controller 170 applies the default window layout policy to the window to be displayed on the display device. Thereafter, the controller 170 proceeds to operation 1713 at which the controller 170 displays the window to which the window layout policy is applied on the display device.
Referring to FIG. 17, when an application execution window is generated in response to the user request, the controller 170 may determine whether any window layout policy exists for the display device on which the window is to be displayed. If the window layout policy of the display device is retrieved, the controller 170 applies the retrieved window layout policy as the window layout policy of the display device. Otherwise, if no window layout policy is retrieved, the controller 170 applies the default window layout policy as the window layout policy of the display device. The controller 170 generates the window to which the window layout policy is applied and provides the application execution screen according to the layout of the window.
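Continuing the illustrative table sketched after FIG. 16 (and reusing its hypothetical types), a small usage example makes the fallback branch of FIG. 17 concrete: a registered external display resolves to the desktop policy, while an unknown display falls back to the default policy.

```java
// Illustrative usage of the WindowLayoutPolicyTable sketched earlier; the
// resolutions used here are arbitrary example values.
final class WindowLayoutPolicyDemo {
    public static void main(String[] args) {
        WindowLayoutPolicyTable table = new WindowLayoutPolicyTable();
        table.register(new DisplayResource(1920, 1080), WindowLayoutPolicy.DESKTOP);

        // Known external display: the registered desktop policy is applied.
        System.out.println(table.policyFor(new DisplayResource(1920, 1080)));

        // Unknown display: no entry is found, so the default policy is applied.
        System.out.println(table.policyFor(new DisplayResource(720, 1280)));
    }
}
```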
FIG. 18 is a signal flow diagram illustrating signal flows between a mobile terminal and an external display device interoperating in a method according to an embodiment of the present disclosure. FIGS. 19A, 19B, 19C, and 19D are diagrams illustrating screen displays of a mobile terminal and an external display device for explaining an interoperation therebetween.
Referring to FIGS. 18, 19A, 19B, 19C, and 19D, at operation 1801, the mobile terminal 100 is connected to the external display device 200. For example, the mobile terminal 100 may detect the connection of the external display device 200 through a wired (e.g., HDMI) link or a wireless (e.g., Wi-Fi) link in a certain operation mode (e.g., standby mode and application execution mode).
If the external display device 200 is connected to the mobile terminal 100 at operation 1801, then at operation 1803, the mobile terminal 100 acquires the resource of the external display device 200. For example, if the external display device 200 is connected, the mobile terminal 100 acquires the resource, such as the screen size, of the external display device 200. The resource of the external display device 200 may be provided in such a way that the mobile terminal 100 requests the connected external display device 200 to transmit the resource and the external display device 200 provides the mobile terminal 100 with its resource in response to the resource transmission request. The external display device 200 may also transmit its resource to the mobile terminal automatically when the external display device 200 is connected to the mobile terminal 100. According to various embodiments of the present disclosure, the resource of the external display device, including the screen size of the external display device 200, is acquired by polling or pushing the resource at the time when the mobile terminal 100 is connected to the external display device 200. The mobile terminal 100 may buffer or store the acquired resource in the storage unit 150 of the mobile terminal 100.
At this time, the host providing the resource may be a slave which does not perform any control function but provides screen data under the control of a master. The master may be the terminal 100 responsible for the control (particularly the controller 170), and the slaves may be the external display device 200 and the display unit 130 of the mobile terminal 100, which display the screen data provided by the mobile terminal 100.
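One hedged reading of operation 1803 is a simple request/response exchange in which the terminal polls the display for its resource, with an alternative push path in which the display volunteers the resource on connection; the ResourceProvider interface and the record type below are illustrative assumptions, not the disclosed protocol.

```java
// Illustrative resource acquisition at connection time (operation 1803).
// The terminal may poll the display for its resource, or the display may
// push the resource on its own when the link comes up.
import java.util.Optional;

record DisplayResourceInfo(int widthPx, int heightPx, String displayName) {}

interface ResourceProvider {
    DisplayResourceInfo queryResource();                 // poll path: answer a request
    Optional<DisplayResourceInfo> pushedResource();      // push path: resource sent on connect, if any
}

final class ResourceAcquirer {
    private DisplayResourceInfo cached;                  // buffered as in the storage unit

    DisplayResourceInfo acquire(ResourceProvider display) {
        // Prefer a resource the display already pushed; otherwise poll for it.
        cached = display.pushedResource().orElseGet(display::queryResource);
        return cached;
    }
}
```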
At operation 1805, the mobile terminal 100 determines the window layout policy corresponding to the resource acquired from the external display device 200. For example, the mobile terminal 100 may determine the window layout policy corresponding to the acquired resource by referencing the predetermined mapping table of the per-display device window layout policies.
At operation 1807, the mobile terminal 100 outputs the desktop window screen (e.g., background screen) to the external display device 200 according to the determined window layout policy. When displaying the desktop window screen, the mobile terminal 100 transfers the desktop window screen generated internally according to the window layout policy to the external display device 200 while maintaining the current display of the mobile terminal 100 (e.g., the current operation mode screen display).
After being connected to the mobile terminal 100, if the desktop window screen is received from the mobile terminal 100, then at operation 1809, the external display device 200 displays the desktop window screen. As illustrated in FIG. 19A, the desktop window screen may include the predetermined background screen 230 and the desktop shortcut icons 250 representing the pre-registered applications, which may be executed in a window corresponding to the resource of the external display device 200.
In the state in which the external display device 200 is connected to the mobile terminal 100, at operation 1811, the mobile terminal may receive a user input for executing an application. For example, the user may select an icon representing a specific application on the screen (e.g., a current screen of the mobile terminal 100 in FIG. 19A) displayed currently on the display device (e.g., display unit 130) of the mobile terminal 100 by manipulating the mobile terminal 100. The user also may manipulate the mobile terminal 100 to select an icon of a specific application (e.g., a shortcut icon 250 for the desktop) on the currently displayed screen of the external display device 200 (e.g., the desktop window screen currently displayed on the external display device 200 in FIG. 19A).
At operation 1813, the mobile terminal 100 determines the window creation region for the application in response to the application execution request. For example, if an application execution request is detected, the mobile terminal 100 may determine the display device on which the application execution screen is to be displayed. For example, if the application execution request is detected on the currently displayed screen of the mobile terminal 100, the mobile terminal 100 may determine that the window creation region is the display device of the mobile terminal 100 (e.g., the display unit 130), and if the application execution request is detected on the currently displayed screen of the external display device 200, the mobile terminal may determine that the window creation region is the external display device 200. Even when the application execution request is detected on the screen of the mobile terminal 100 in the state in which at least one external display device 200 is connected, the external display device 200 may be determined as the window creation region. The determination that the window creation region corresponds to the external display device 200 may depend on the user configuration.
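The decision at operation 1813 could be sketched as follows: the target display follows where the execution request was detected, with an optional user setting that routes requests made on the terminal screen to the external display anyway. All names and the boolean flags are assumptions made for the example.

```java
// Illustrative decision for the window creation region (operation 1813).
enum WindowCreationRegion { MOBILE_DISPLAY, EXTERNAL_DISPLAY }

final class WindowRegionResolver {
    private final boolean preferExternalForMobileRequests;  // hypothetical user configuration

    WindowRegionResolver(boolean preferExternalForMobileRequests) {
        this.preferExternalForMobileRequests = preferExternalForMobileRequests;
    }

    WindowCreationRegion resolve(boolean requestDetectedOnExternalScreen,
                                 boolean externalDisplayConnected) {
        if (requestDetectedOnExternalScreen) {
            return WindowCreationRegion.EXTERNAL_DISPLAY;
        }
        // A request made on the terminal screen may still open on the external
        // display when one is connected and the user configured it that way.
        if (externalDisplayConnected && preferExternalForMobileRequests) {
            return WindowCreationRegion.EXTERNAL_DISPLAY;
        }
        return WindowCreationRegion.MOBILE_DISPLAY;
    }
}
```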
If the window creation region is determined, then at operation 1815, the mobile terminal 100 determines the window layout policy corresponding to the display device serving as the window creation region. For example, the mobile terminal 100 may reference the mapping table of the per-display device window layout policies to select the window layout policy corresponding to the resource of the display device serving as the window creation region (e.g., the display unit 130 of the mobile terminal 100 or the external display device 200). FIG. 18 is directed to the case in which the window creation region for the executed application is the external display device 200.
At operation 1817, the mobile terminal 100 outputs the application execution screen corresponding to the determined window layout policy to the external display device 200. For example, the mobile terminal 100 configures the window layout of the executed application according to the determined window layout policy and outputs the application execution screen corresponding to the window layout to which the window layout policy has been applied. For example, the mobile terminal 100 determines the application represented by the icon 250 selected by the user among the icons 250 displayed on the external display device 200. The mobile terminal 100 configures the application execution window with the layout according to the determined window layout policy and transfers the application execution screen to the external display device 200. For example, the mobile terminal 100 may execute the application with the layout corresponding to the resource of the display device (e.g., the external display device 200) upon determining the external display device 200 as the application window creation region and process the screen display.
If the application execution screen is received from the mobile terminal 100 at operation 1817, then at operation 1819, the external display device 200 displays the application execution screen. The application execution screen is provided in the form of a window screen having the layout fit for the resource of the external display device 200, different from the window screen displayed on the display device (e.g., display unit 130) of the mobile terminal 100, as shown in FIG. 19B. For example, the application execution screen may be implemented in the form of a desktop window layout according to the resource of the external display device 200. Referring to FIG. 19B, the desktop window may have a size larger than that displayed on the mobile terminal 100, along with detailed information. Further supplementary information may be provided at the top side of the frame of the window. The status bar 550 may present the status information (e.g., application name) on the corresponding window (e.g., execution screen). Particularly, the status bar 550 may include the window size adjustment buttons 555 such as executed window minimization, maximization, and window close buttons. At this time, the mobile terminal 100 may maintain the screen in its current state. Further, the desktop window screen may include the predetermined background screen 230 and the desktop shortcut icons 250 representing the pre-registered applications, which may be executed in a window corresponding to the resource of the external display device 200.
FIG. 19C is a diagram illustrating screen displays of the mobile terminal and the external display device with distinct window layouts according to an embodiment of the present disclosure.
As shown in FIG. 19C, the system is configured with the mobile terminal 100 supporting a multi-screen function in connection with the external display device 200 displaying the screen output by the mobile terminal 100 in the window layout determined according to the per-display device window layout policy.
In FIG. 19C, the mobile terminal 100 and the external display device 200 are connected to each other through a wired interface (e.g., HDMI) or a wireless interface (e.g., Wi-Fi). The mobile terminal 100 transmits various screen data to the external display device 200 through the wired interface (e.g., HDMI) or the wireless interface (e.g., Wi-Fi).
In FIG. 19C, a certain application is executed, and its execution screen is displayed by the external display device 200 according to the external display device-specific window layout policy and by the internal display device of the mobile terminal 100 (e.g., display unit 130) according to the mobile terminal-specific window layout policy under the control of the mobile terminal 100.
Referring to FIG. 19C, the mobile terminal, in the state of being connected to the external display device 200, may provide the external display device 200 with the application execution screen to which the window layout policy corresponding to the resource of the external display device is applied and provide the display unit 130 of the mobile terminal 100 with the application execution screen to which the window layout policy corresponding to the resource of the display unit 130 of the mobile terminal (e.g., the default window layout policy) is applied. For example, the application execution screen may be provided in different window layouts determined according to the display device-specific window layout policies.
If an application execution request is detected on the region of the external display device 200, the mobile terminal executes the application in the window format based on the window layout policy matching the resource of the external display device 200 and processes the output of the corresponding window. If the application execution request is detected on the region of the display unit 130 of the mobile terminal 100, the mobile terminal 100 executes the application in the window format based on the window layout policy matching the resource of the display unit 130 and processes the output of the corresponding window. In the case of generating the window according to the application execution at the external display device 200, the mobile terminal 100 formats the application execution screen in match with the resource of the external display device 200. In the case of generating the window according to the application execution at the mobile terminal 100, the mobile terminal 100 formats the application execution screen in match with the resource of the display unit 130.
FIG. 19D is a diagram illustrating screen displays of the mobile terminal and the external display device with distinct window layouts according to an embodiment of the present disclosure.
FIG. 19D shows the screen displays in the state in which the mobile terminal 100 and the external display device 200 are connected through a wired link (e.g., HDMI) or a wireless link (e.g., Wi-Fi) and the execution screen of a certain application is displayed across the region of the mobile terminal 100 and the region of the external display device 200. For example, in the state in which the application is executed with the execution window displayed on the region of the external display device 200, the application execution window is moved to the region of the mobile terminal 100 in response to the user's window move manipulation such that the application execution window appears partially on the region of the mobile terminal 100.
As shown in FIG. 19D, a part of the execution screen of an application may be displayed on the region of the mobile terminal 100 as formatted in the layout according to the window layout policy of the display unit 130 of the mobile terminal, while the other part of the execution screen is displayed on the region of the external display device 200 as formatted in the layout according to the window layout policy of the external display device 200. For example, the execution window of the same application may be displayed in distinct layouts depending on the display device-specific window layout policies. In the case of a photo gallery application, the application execution window for the external display device 200 may be displayed in the layout having a list of photos and a detailed description of a selected photo (e.g., an enlarged photo), and the application execution window for the mobile terminal 100 may be displayed in the layout having only the list of the photos.
Although not shown in FIG. 19D, the user may manipulate such that the application execution window is moved from the region of one display device to the region of the other display device completely. In this case, the application execution window may be changed in layout from the window layout policy of the old display device to the window layout policy of the new display device. For example, if the application execution window is moved from the region of the external display device 200 to the region of the mobile terminal 100 completely according to the user's manipulation, the application execution window may be formatted in the window layout according to the mobile terminal-specific window layout policy. In contrast, if the application execution window is moved from the region of the mobile terminal 100 to the region of the external display device 200 completely, the application execution window may be formatted in the window layout according to the external display device-specific window layout policy.
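As a sketch of that re-formatting step, reusing the illustrative policy types introduced after FIG. 16, the window could simply re-resolve its layout policy against the display that now fully owns it; the MovableAppWindow type and its callback are assumptions for illustration.

```java
// Illustrative re-layout when an application window is moved entirely from
// one display to another: the window re-resolves its layout policy against
// the new display's resource and rebuilds its contents if the policy changed.
final class MovableAppWindow {
    private WindowLayoutPolicy currentPolicy;

    MovableAppWindow(WindowLayoutPolicy initialPolicy) {
        this.currentPolicy = initialPolicy;
    }

    void onMovedCompletelyTo(DisplayResource newDisplay, WindowLayoutPolicyTable table) {
        WindowLayoutPolicy newPolicy = table.policyFor(newDisplay);
        if (newPolicy != currentPolicy) {
            currentPolicy = newPolicy;
            // A real implementation would rebuild the window contents here,
            // e.g. switch a gallery from list-plus-detail to a list-only layout.
        }
    }

    WindowLayoutPolicy currentPolicy() {
        return currentPolicy;
    }
}
```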
Descriptions are made of the various embodiments related to the display function of the external display device in response to the user input in detail hereinafter with reference to FIGS. 20 to 24.
FIG. 20 is a flowchart illustrating a method for processing a user input according to an embodiment of the present disclosure.
Referring to FIG. 20, at operation 2010, the controller 170 determines that the mobile terminal 100 is operating in the single display mode.
At operation 2015, the controller 170 determines whether any external display device is connected to the mobile terminal 100. For example, the external display device may be connected to the mobile terminal 100 through the radio communication unit or the interface unit 160.
If the controller 170 determines that an external display device is connected to the mobile terminal 100, then the controller 170 proceeds to operation 2020 at which the controller 170 controls the mobile terminal 100 to operate in the dual display mode. At operation 2020, if the external display device is connected to the mobile terminal 100, the controller 170 sends the external display device a message requesting screen size information (e.g., resolution information). In response to the request, the external display device sends the mobile terminal 100 a response message including the screen size information. The display unit 130 displays a first home screen including an icon of a virtual controller under the control of the controller 170. The external display device displays a second home screen under the control of the controller 170. At this time, the resolution of the second home screen is determined based on the screen size information received from the external display device. For example, the second home screen is formatted to match the screen size of the external display device. If the user makes a tap gesture on the icon of the virtual controller on the first home screen, the display unit 130 generates an input signal corresponding to the tap gesture and transfers the input signal to the controller 170. The controller 170 receives the input signal and executes the virtual controller in response to the input signal. The virtual controller requests the kernel to generate a virtual touchscreen. The kernel, particularly the U-input module, generates a virtual touchscreen for emulating the display unit 130 as the input device of the external display device. Meanwhile, the display unit 130 may display the execution screen of the virtual controller (e.g., emulation screen) under the control of the controller 170. At this time, the emulation screen includes a touch region. The emulation screen also may include at least one soft key (e.g., left mouse button, right mouse button, minimization button, maximization button, and close button). The emulation screen may be displayed in the form of an overlaid window. In addition, the emulation screen may be displayed on the second screen region while another screen is displayed in the first screen region. The emulation screen is closed in response to a user's request. If the user selects the corresponding icon on the first home screen, the emulation screen appears again. The virtual controller may be executed automatically when the external display device is connected to the mobile terminal 100. The auto-execution function may be turned on/off by the user. In the case in which the emulation screen is displayed, the controller 170 controls the external display device to display a pointer on the other screen (e.g., the second home screen) in the form of an overlay. If the emulation screen function is terminated, the pointer disappears.
At operation 2030, the controller 170 monitors to detect a user input.
At operation 2035, the controller 170 determines whether a user input is detected on the display unit 130. The user input may be any of touch, multi-touch, tap, double tap, long tap, drag, flick, press, pinch-in, pinch-out, and/or the like.
If the controller determines that the user input is not detected on the display unit 130 at operation 2035, then the controller 170 proceeds to operation 2030.
If the controller 170 determines that the user input is detected on the display unit 130 at operation 2035, then the controller 170 proceeds to operation 2040 at which the controller 170 determines whether the user input is related to the external display device. For example, if no emulation screen is displayed, the controller 170 determines that the user input is not related to the external display device. Otherwise, if the emulation screen is displayed, the controller 170 determines that the user input is related to the external display device. If the user input is detected on the first screen region in the state in which the emulation screen is displayed on the first screen region and another screen is displayed on the second screen region, the controller 170 determines that the user input is related to the external display device. If the user input is detected on the second screen region, the controller 170 determines that the user input is not related to the external display device.
If the controller 170 determines that the user input is related to the external display device at operation 2040, then the controller proceeds to operation 2050 at which the controller 170 performs a function related to the screen of the external display device in response to the user input. For example, if a drag gesture is detected on the touch region of the emulation screen, the controller 170 moves the pointer in the drag direction. For example, the external display device shows the pointer in motion under the control of the controller 170. If a tap gesture is detected on the touch region of the emulation screen or a left mouse button is clicked in the state in which the pointer is placed on an icon, the controller 170 executes the application represented by the icon on which the pointer is placed and controls the external display device to display the execution screen. If the user input is the "tap & touch followed by drag," the controller 170 moves the pointer along with the execution window on which the pointer is placed. As described above with reference to operation 2050, the display unit 130 is used as the input device for the external display device.
If the controller 170 determines that the user input is not related to the screen of the external display device (e.g., if the user input is related to the screen of the display unit 130) at operation 2040, then the controller 170 proceeds to operation 2060 at which the controller 170 performs a function related to the screen of the display unit 130 in response to the user input. For example, if the user input is a drag, the controller 170 changes the screen according to the drag direction. For example, the first page of the first home screen may be replaced by the second page. If the user input is a tap, the controller 170 executes the application represented by the tapped icon and controls the display unit 130 to display the execution screen.
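A minimal sketch of the routing decision at operations 2040, 2050, and 2060 is given below: an input drives the external display only when the emulation screen is shown and the touch lands inside it, and drives the terminal's own screen otherwise. The router class and the emulation bounds are assumptions for illustration.

```java
// Illustrative routing of a touch input in dual display mode: input inside
// the displayed emulation screen is forwarded toward the external display,
// any other input is handled by the terminal's own screen.
import java.awt.Rectangle;

final class DualDisplayInputRouter {
    private boolean emulationScreenShown;           // virtual controller visible?
    private final Rectangle emulationBounds;        // region occupied by the emulation screen

    DualDisplayInputRouter(Rectangle emulationBounds) {
        this.emulationBounds = emulationBounds;
    }

    void setEmulationScreenShown(boolean shown) {
        this.emulationScreenShown = shown;
    }

    /** Returns true when the input should be treated as related to the external display. */
    boolean targetsExternalDisplay(int touchX, int touchY) {
        return emulationScreenShown && emulationBounds.contains(touchX, touchY);
    }
}
```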
At operation 2070, the controller 170 determines whether a dual display mode termination request is input. For example, when the connection between the external display device and the mobile terminal 100 is released, the dual display mode is terminated. Otherwise, the controller 170 returns to operation 2030.
FIG. 21 is a diagram illustrating software architecture for explaining a procedure of moving a pointer on an external display device according to an embodiment of the present disclosure. FIG. 22 is a diagram illustrating screen displays of a mobile terminal and an external display device for explaining a procedure of moving a pointer on an external display device according to an embodiment of the present disclosure.
Referring to FIG. 21, the controller 170 includes a virtual controller 310, a U-input module 320, a virtual touchscreen 330, a virtual multi-touchscreen 340, an Xserver 350, and an application 360.
At operation 1, the virtual controller 310 is executed.
At operation 2, the virtual controller 310 controls the display unit 130 to display an emulation screen including an emulation region 410, a left mouse button 420, a right mouse button 430, a minimization button 440, a maximization button 450, and an end button 460.
At operation 3, the virtual controller 310 transmits a request for virtual touchscreen creation to the U-input module 320.
At operation 4, the U-input module 320 creates a virtual touchscreen and a virtual multi-touchscreen.
At operation 5, the touch coordinates (xn, yn) are transferred to the Xserver 350.
At operation 6, the Xserver 350 forwards the touch coordinates (xn, yn) to the virtual controller 310.
At operation 7, the virtual controller 310 determines whether (xn, yn) are the coordinates of the emulation region 410.
If the touch coordinates (xn, yn) are the coordinates of the emulation region 410, then at operation 8, the virtual controller 310 calculates the touch displacement (dx, dy), where dx = xn - xn-1 and dy = yn - yn-1, and (xn-1, yn-1) are the previously detected touch coordinates.
At operation 9, the virtual controller 310 sends the position displacement (dx, dy) to the U-input module 320.
At operation 10, the U-input module 320 sends the position displacement (dx, dy) to the virtual touchscreen 330.
At operation 11, the virtual touchscreen 330 sends the position displacement (dx, dy) to the Xserver 350.
At operation 12, the Xserver 350 sends the position displacement (dx, dy) to the application 360 associated with the external display device. The application moves the pointer 470 from the first position 480 to the second position 490 in response to the position displacement (dx, dy) as shown in FIG. 22.
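A minimal sketch of the displacement calculation at operations 7 to 9 follows, assuming consecutive touch samples within the emulation region; the tracker class name and its reset behavior on finger lift are assumptions, while the delta formula mirrors the one stated above.

```java
// Illustrative displacement computation for pointer movement: the delta
// between consecutive touch samples in the emulation region becomes the
// (dx, dy) forwarded toward the virtual touchscreen and, eventually, the
// application that draws the pointer on the external display.
final class PointerDeltaTracker {
    private Integer lastX;   // previous touch sample, null until the first touch
    private Integer lastY;

    /** Returns {dx, dy} for this sample, or {0, 0} for the first sample. */
    int[] onTouchSample(int x, int y) {
        int dx = (lastX == null) ? 0 : x - lastX;
        int dy = (lastY == null) ? 0 : y - lastY;
        lastX = x;
        lastY = y;
        return new int[] { dx, dy };
    }

    /** Forget the previous sample, e.g. when the finger is lifted. */
    void reset() {
        lastX = null;
        lastY = null;
    }
}
```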
FIG. 23 is a diagram illustrating software architecture for explaining a procedure of changing a size of an image displayed on an external display device according to an embodiment of the present disclosure. FIG. 24 is a diagram illustrating screen displays of a mobile terminal and an external display device for explaining a procedure of changing a size of an image displayed on an external display device according to an embodiment of the present disclosure.
Referring to FIG. 23, the controller 170 includes a virtual controller 510, a U-input module 520, a virtual touchscreen 530, a virtual multi-touchscreen 540, an Xserver 550, and an application 560.
Operations 1 to 4 of FIG. 23 are identical to those of FIG. 21.
At operation 5, the first touch coordinates (x1, y1) and the second touch coordinates (x2, y2) are sent to the Xserver 550.
At operation 6, the Xserver 550 sends the first and second touch coordinates (x1, y1) and (x2, y2) to the virtual controller 510.
At operation 7, the virtual controller 510 determines whether the first and second touch coordinates (x1, y1) and (x2, y2) are the coordinates in the emulation region 610.
If the first and second touch coordinates (x1, y1) and (x2, y2) are the coordinates in the emulation region 610, then at operation 8, the virtual controller 510 determines the size (A) of the emulation region 610 and the size (B) of the execution window 620 (see FIG. 6) and respectively converts the first and second touch coordinates (x1, y1) and (x2, y2) to the third and fourth touch coordinates (x3, y3) and (x4, y4).
At operation 9, the virtual controller 510 sends the third and fourth touch coordinates (x3, y3) and (x4, y4) to the U-input module 520.
At operation 10, the U-input module 520 sends the virtual multi-touchscreen 540 the third and fourth touch coordinates (x3, y3) and (x4, y4).
At operation 11, the virtual multi-touchscreen 540 sends the Xserver 550 the third and fourth touch coordinates (x3, y3) and (x4, y4).
At operation 12, the Xserver 550 sends the application related to the external display device the third and fourth touch coordinates (x3, y3) and (x4, y4). The application 560 changes (e.g., enlarges) the size of the execution window 620 based on the third and fourth touch coordinates (x3, y3) and (x4, y4).
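One plausible reading of the conversion at operation 8 is a proportional mapping from the emulation region (size A) into the execution window (size B); the disclosure only states that the conversion depends on A and B, so the linear scaling below, like the class name, is an assumption for illustration.

```java
// Illustrative coordinate conversion for the pinch gesture: touch points in
// the emulation region (size A) are scaled proportionally into the execution
// window on the external display (size B).
final class EmulationToWindowMapper {
    private final double scaleX;
    private final double scaleY;

    EmulationToWindowMapper(int emulationWidth, int emulationHeight,
                            int windowWidth, int windowHeight) {
        this.scaleX = (double) windowWidth / emulationWidth;
        this.scaleY = (double) windowHeight / emulationHeight;
    }

    /** Converts an emulation-region coordinate into an execution-window coordinate. */
    int[] convert(int x, int y) {
        return new int[] { (int) Math.round(x * scaleX), (int) Math.round(y * scaleY) };
    }
}
```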
As described above, according to various embodiments of the present disclosure, the desktop virtualization method and apparatus of the mobile terminal provide a desktop virtualization function capable of allowing the mobile terminal to perform desktop environment operations in interoperation with an external display device. Accordingly, the desktop virtualization method and apparatus according to various embodiments of the present disclosure are capable of providing the user with the desktop window environment for presenting more information and meeting various user requirements. According to various embodiments of the present disclosure, the desktop virtualization method and apparatus are capable of resolving the inconvenience caused by the size-constrained screen of the mobile device and allowing the user to use the mobile terminal in the desktop environment on the large screen of the connected external display device.
According to various embodiments of the present disclosure, the desktop virtualization method and apparatus are capable of providing an optimized desktop window environment to the mobile terminal user so as to improve the user convenience and usability and competitiveness of the mobile terminal. According to various embodiments of the present disclosure, the desktop virtualization method and apparatus are applicable to all the types of the mobile terminals and equivalent devices supporting multi-screen function.
According to various embodiments of the present disclosure, the per-display device window layout provision method and apparatus of the mobile terminal are capable of providing the user with screen displays in various window layouts fitted to various display devices. According to various embodiments of the present disclosure, the method and apparatus are capable of providing display device-specific screen layouts so as to improve user convenience.
According to various embodiments of the present disclosure, the per-display device window layout provision method and apparatus of the present disclosure are capable of configuring distinct window layout policies corresponding to at least two display devices such that, when an application is executed, the execution screen of the application is formed in the display device-specific window layout according to the window layout policy matching the connected display device. According to various embodiments of the present disclosure, the method and apparatus are capable of resolving the inconvenience caused by the size-constrained screen of the mobile terminal and of allowing the user to use the mobile terminal in a desktop environment on the large screen of the connected external display device.
According to various embodiments of the present disclosure, the method and apparatus are capable of providing an optimal environment for displaying the screen in a format matching the display device so as to improve user convenience, usability, and the competitiveness of the mobile terminal. According to various embodiments of the present disclosure, the method and apparatus are applicable to all types of mobile terminals and equivalent devices supporting a multi-screen function.
According to various embodiments of the present disclosure, the user input processing method and apparatus of the mobile terminal interoperating with an external display device are capable of providing the user with a dual display mode environment without an extra external input device. According to various embodiments of the present disclosure, the method and apparatus are capable of providing the user with the dual display mode environment using a touchscreen of the mobile terminal and an external display device.
The above-described various embodiments of the present disclosure can be implemented in the form of computer-executable program commands and stored in a non-transitory computer-readable storage medium. The non-transitory computer-readable storage medium may store the program commands, data files, and data structures in individual or combined forms. The program commands recorded in the storage medium may be designed and implemented for various embodiments of the present disclosure or may be those used by those skilled in the computer software field.
The non-transitory computer-readable storage medium includes magnetic media such as a floppy disk and a magnetic tape, optical media such as a Compact Disc (CD) ROM and a Digital Video Disc (DVD) ROM, magneto-optical media such as a floptical disk, and hardware devices designed for storing and executing program commands such as a ROM, a RAM, a flash memory, and the like. The program commands include high-level language code executable by a computer using an interpreter as well as machine language code created by a compiler. The aforementioned hardware devices can be implemented with one or more software modules for executing the operations of the various embodiments of the present disclosure.
While the present disclosure has been shown and described with reference to various embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present disclosure as defined by the appended claims and their equivalents.