CROSS-REFERENCE TO RELATED APPLICATION
Pursuant to 35 U.S.C. § 119(a), this application claims the benefit of an earlier filing date and right of priority to Korean Application Nos. 10-2013-0139290, filed on Nov. 15, 2013, and 10-2014-0069547, filed on Jun. 9, 2014, the contents of which are hereby incorporated by reference herein in their entirety.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present disclosure relates to a mobile terminal including a lateral touch sensing unit configured to receive a touch input for displaying visual information.
Furthermore, the present disclosure relates to a mobile terminal including a display unit configured to receive a touch input for displaying visual information.
2. Description of the Related Art
A mobile terminal is a portable electronic device having at least one of a voice and video communication function, an information input and output function, a data storage function, and the like.
As it has become multifunctional, the mobile terminal can capture still images or moving images, play music or video files, play games, receive broadcasts, and the like, so as to be implemented as an integrated multimedia player.
To support design and functional modifications of the terminal, studies on modifying or extending the structures through which a user's control command is applied have been carried out. In recent years, structures in which the display area of the mobile terminal is extended have been developed.
SUMMARY OF THE INVENTION
Accordingly, the technical task of the present disclosure is to propose a control method for applying a control command to a touch sensing unit formed on a lateral surface of a mobile terminal to control the mobile terminal.
In order to accomplish the foregoing task of the present disclosure, there is provided a mobile terminal, including a body having a front surface, a rear surface and a lateral surface thereof, a display unit disposed on the front surface to display first screen information, a lateral touch sensing unit formed on the lateral surface adjacent to both edges of the display unit to receive a user's consecutive touch input, and a controller configured to control the display unit to display second screen information in one region on the display unit corresponding to a touch range of the consecutive touch input.
According to an example associated with the present disclosure, the first and the second screen information may correspond to the execution screens of a first and a second application which are distinguished from each other.
According to an example associated with the present disclosure, the first and the second application may both be in execution, having been sequentially activated.
According to an example associated with the present disclosure, the display unit may display the first and the second screen information in separate regions on the display unit, and the controller may control the first application or the second application based on a touch input applied to the first screen information or second screen information.
According to an example associated with the present disclosure, the controller may control the display unit to display an icon of a recommended application associated with the first screen information when a first application is initially or finally activated among the applications being executed.
According to an example associated with the present disclosure, the controller may selectively display the screen information occupying the larger region between the first and the second screen information on the display unit when a touch input applied to the lateral touch sensing unit is released.
According to an example associated with the present disclosure, the first screen information may include at least one content, and the second screen information may be formed with visual information associated with content selected from the contents.
According to an example associated with the present disclosure, the controller may select content displayed in one region on the display unit corresponding to the touch range among a plurality of contents contained in the first screen information.
According to an example associated with the present disclosure, the visual information may correspond to a menu image for receiving a touch input to edit the content.
According to an example associated with the present disclosure, the visual information may correspond to a menu image for receiving a touch input to execute an application associated with the content.
According to an example associated with the present disclosure, the visual information may correspond to lower information associated with the content, and the controller may control the display unit to display some of a plurality of pieces of lower information in correspondence with the touch range.
According to an example associated with the present disclosure, the first screen information may correspond to a home screen page containing at least one icon corresponding to an application, and the controller may control the display unit to select at least one icon based on the touch range, and display lower information associated with the application of the selected icon.
According to an example associated with the present disclosure, the touch sensing unit may include a display unit configured to display an image, and the controller may display a notification image in one region of the touch sensing unit for receiving a touch input to display the second screen information.
According to an example associated with the present disclosure, when the first screen information contains a plurality of contents, and the second screen information is visual information associated with one content, the notification image may be displayed in one region on the lateral touch sensing unit adjacent to the content.
In order to accomplish the foregoing task of the present disclosure, there is provided a method of controlling a mobile terminal according to another embodiment disclosed in the present disclosure, and the method may include displaying first screen information on the display unit, sensing a consecutive touch input applied to a touch sensing unit connected to both lateral surfaces of the display unit, and displaying second screen information in one region on the display unit corresponding to a touch range of the consecutive touch input.
According to an example associated with the present disclosure, the method may further include storing a sequence in which a plurality of applications being executed are activated, wherein the second screen information corresponds to one of the execution screens of the applications selected based on the sequence.
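For purposes of illustration only, the control flow summarized above can be sketched in code. The following Kotlin fragment is not part of the claimed subject matter; every name in it is hypothetical, and it merely assumes a lateral sensing unit whose touched span is mapped proportionally onto the front display, together with a list of running applications ordered by activation.

```kotlin
// Hypothetical sketch of the summarized control flow; names are illustrative only.
data class Region(val top: Int, val bottom: Int)

class LateralTouchController(
    private val displayHeightPx: Int,
    private val sensorLengthPx: Int,
    private val runningApps: List<String>   // activation order, most recently activated last
) {
    // Map the touched span of the lateral sensing unit onto a vertical region of the display.
    private fun regionFor(startY: Int, endY: Int): Region {
        val scale = displayHeightPx.toDouble() / sensorLengthPx
        return Region(
            top = (minOf(startY, endY) * scale).toInt(),
            bottom = (maxOf(startY, endY) * scale).toInt()
        )
    }

    // Called while a consecutive touch moves along the lateral sensing unit.
    fun onLateralTouch(startY: Int, endY: Int) {
        val region = regionFor(startY, endY)
        // The application activated immediately before the current one supplies
        // the second screen information, per the stored activation sequence.
        val secondApp = runningApps.getOrNull(runningApps.size - 2) ?: return
        println("Display execution screen of '$secondApp' in region $region")
    }
}

fun main() {
    val controller = LateralTouchController(
        displayHeightPx = 1920,
        sensorLengthPx = 960,
        runningApps = listOf("gallery", "browser", "messages")
    )
    controller.onLateralTouch(startY = 100, endY = 400)  // region tracks the touch range
}
```

The essential points the sketch captures are that the size of the region displaying the second screen information tracks the touch range, and that the second application is selected from the stored activation sequence.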
Still another technical task of the present disclosure is to propose a control method of applying a control command to a display unit formed on a lateral surface of a terminal to control a mobile terminal.
Yet still another technical task of the present disclosure is to display screen information having at least two display directions at the same time on the display unit.
In order to accomplish the foregoing task of the present disclosure, there is provided a mobile terminal, including a body having a front surface and a lateral surface thereof, a display unit comprising a first region disposed on the front surface of the body and a second region connected to the first region and disposed on the lateral surface, and a controller configured to display information associated with the screen information as screen information having a direction different from the displayed direction of the screen information in at least part of the first region when a preset type of touch is sensed on the second region in a state that screen information associated with a function being executed by the terminal is displayed in the first region, wherein the screen information having the different direction is displayed in at least part of the first region corresponding to the preset type of touch on the second region.
According to an embodiment, when the screen information has a first display direction, the screen information having the different direction may be screen information to be displayed in the first region when the screen information is displayed in a second display direction different from the first display direction.
According to an embodiment, the controller may display a region containing the same information as the screen information having the first display direction in a visually distinguished manner among the screen information having the second display direction.
According to an embodiment, the controller may control the display unit to display the screen information having the first display direction along with the screen information having the second display direction.
According to an embodiment, when a touch is applied to any one position of a region displayed with the screen information having the second display direction, the controller may scroll the screen information having the first display direction such that screen information corresponding to the any one position of the screen information having the first display direction moves to the center of the first region.
According to an embodiment, the second region may include a first sub-region disposed on either one of both lateral surfaces of the body and a second sub-region disposed on the other lateral surface, and when touches applied to the first and the second sub-region at the same time are sensed, the controller may display information associated with the screen information to have a different direction from the display direction of the screen information in a region corresponding to a region of the first region to which the touches are applied.
According to an embodiment, when a touch with respect to at least two regions is sensed on either one lateral surface of the second region provided on both lateral surfaces, the controller may display information having the different direction in a region corresponding to a region in which the touch is sensed.
According to an embodiment, the controller may determine a direction in which information associated with the screen information is displayed according to a direction to which the preset type of touch is applied.
According to an embodiment, the second region may include a first sub-region disposed on either one of both lateral surfaces of the body and a second sub-region disposed on the other lateral surface, and the controller may determine a location at which information associated with the screen information is displayed based on drag inputs being sensed in different directions with respect to the first and the second sub-region, respectively.
According to an embodiment, screen information displayed in the first region may maintain its display location independently from information associated with the screen information being displayed in at least part of the first region.
According to an embodiment, the mobile terminal may further include at least two camera units having different capture directions, wherein when a preset type of touch is sensed in the second region in a state that an image received from either one of the at least two camera units is displayed on the display unit, the controller activates a camera unit having a capture direction different from that of the one in use, and displays an image received from the camera unit having the different capture direction on at least part of the display unit.
According to an embodiment, at least two images received from the camera units having different capture directions from each other may be displayed at the same time on the display unit, and the controller may capture either one of the at least two images having different capture directions based on a user's request.
According to an embodiment, the controller may analyze screen information displayed in the first region, and determine at least part of the screen information displayed in a different direction from a current display direction when a preset type of touch is sensed in the second region.
According to an embodiment, the controller may no longer display information associated with the screen information in response to a touch being applied in an opposite direction to that of the preset type of touch in a state that the screen information having different display directions and information associated with the screen information are displayed at the same time in the first region.
A method of controlling a mobile terminal having a display unit on a front surface and a lateral surface thereof may include sensing a preset type of touch with respect to a second region disposed on the lateral surface in a state that screen information associated with a function being executed in a terminal is displayed in a first region disposed on the front surface, and displaying information associated with screen information displayed in the first region to have a different direction from the displayed direction of the screen information in at least part of the first region in response to the preset type of touch, wherein the display location of the information having the different direction is determined according to a direction to which a preset type of touch with respect to the second region is applied.
According to an embodiment, the information to be displayed in a different direction may be displayed in a region corresponding to a region of the first region to which the preset type of touch is applied.
According to an embodiment, when the screen information has a first display direction, the information having the different direction may be information to be displayed in the first region when the screen information is displayed in a second display direction different from the first display direction.
According to an embodiment, a region containing the same information as the screen information having the first display direction among the information having the second display direction may be displayed in a visually distinguished manner from other information.
According to an embodiment, the screen information having the first display direction and the information having the second display direction may be displayed at the same time.
According to an embodiment, the information having the second display direction may be displayed to be overlapped with at least part of the screen information displayed in the first display direction.
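Again purely as an illustration, and not as a definition of the claimed method, the following Kotlin sketch shows one hypothetical way a controller could overlay screen information having a second display direction on part of the first region in response to a preset lateral touch, and remove it upon a touch in the opposite direction; all names are invented for this sketch.

```kotlin
// Hypothetical sketch: overlaying screen information in a second display direction.
enum class Direction { PORTRAIT, LANDSCAPE }

data class Overlay(val direction: Direction, val top: Int, val bottom: Int)

class DirectionController(private val displayHeightPx: Int) {
    private val current = Direction.PORTRAIT
    var overlay: Overlay? = null
        private set

    // Preset touch sensed on the lateral (second) region: show the other display
    // direction's rendering in the part of the first region spanned by the touch.
    fun onPresetLateralTouch(touchTop: Int, touchBottom: Int) {
        val other = if (current == Direction.PORTRAIT) Direction.LANDSCAPE else Direction.PORTRAIT
        overlay = Overlay(
            direction = other,
            top = touchTop.coerceIn(0, displayHeightPx),
            bottom = touchBottom.coerceIn(0, displayHeightPx)
        )
    }

    // A touch in the opposite direction dismisses the overlaid information.
    fun onOppositeLateralTouch() {
        overlay = null
    }
}

fun main() {
    val controller = DirectionController(displayHeightPx = 1920)
    controller.onPresetLateralTouch(touchTop = 300, touchBottom = 900)
    println(controller.overlay)   // Overlay(direction=LANDSCAPE, top=300, bottom=900)
    controller.onOppositeLateralTouch()
    println(controller.overlay)   // null
}
```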
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention.
In the drawings:
FIG. 1A is a block diagram illustrating a mobile terminal associated with the present disclosure;
FIGS. 1B and 1C are conceptual views in which an example of a mobile terminal associated with the present disclosure is seen from different directions;
FIGS. 2A and 2B are conceptual views illustrating a communication system in which a mobile terminal according to the present disclosure is operable;
FIGS. 3A and 3B are conceptual views illustrating a mobile terminal having a lateral display unit;
FIG. 4 is a flow chart for explaining a control method of a mobile terminal according to an embodiment of the present disclosure;
FIGS. 5A(a) to 5D(c) are conceptual views for explaining a control method of a mobile terminal according to various embodiments of the present disclosure;
FIGS. 6A(a) to 6B(c) are conceptual views for explaining a control method of displaying additional information on one content selected from a plurality of contents contained in screen information;
FIGS. 7A(a) to 7C(c) are conceptual views for explaining a control method of displaying lower information of the selected content;
FIGS. 8A to 8C are conceptual views for explaining a control method according to a touch scheme of a touch input applied to touch sensing units at both sides thereof;
FIG. 9 is a flow chart illustrating a method of controlling both lateral surfaces of the display unit according to the present disclosure;
FIGS. 10A(a) to 10D(b) are conceptual views illustrating the control method of FIG. 9;
FIGS. 11A(a) to 11C(c) are conceptual views for explaining a relationship between screen information associated with each other;
FIGS. 12A(a) to 12B(b) are conceptual views for explaining an embodiment of displaying screen information having a second display direction;
FIGS. 13A(a) to 13B(b) are conceptual views illustrating a method of determining the display locations of at least some of the screen information having the first and the second display directions;
FIGS. 14A(a) to 14D(b) are conceptual views illustrating a type of screen information having a second display direction;
FIGS. 15A(a) to 15B(c) are conceptual views illustrating a method of displaying images received from cameras having different capture directions; and
FIGS. 16A and 16B are conceptual views illustrating that a specific region of the screen information having a first display direction is set to screen information having a second display direction.
DETAILED DESCRIPTION OF THE INVENTION
Description will now be given in detail according to exemplary embodiments disclosed herein, with reference to the accompanying drawings. For the sake of brief description with reference to the drawings, the same or equivalent components may be provided with the same or similar reference numbers, and description thereof will not be repeated. In general, a suffix such as “module” and “unit” may be used to refer to elements or components. Use of such a suffix herein is merely intended to facilitate description of the specification, and the suffix itself is not intended to give any special meaning or function. In the present disclosure, that which is well-known to one of ordinary skill in the relevant art has generally been omitted for the sake of brevity. The accompanying drawings are used to help easily understand various technical features and it should be understood that the embodiments presented herein are not limited by the accompanying drawings. As such, the present disclosure should be construed to extend to any alterations, equivalents and substitutes in addition to those which are particularly set out in the accompanying drawings.
It will be understood that although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are generally only used to distinguish one element from another.
It will be understood that when an element is referred to as being “connected with” another element, the element can be connected with the other element or intervening elements may also be present. In contrast, when an element is referred to as being “directly connected with” another element, there are no intervening elements present.
A singular representation may include a plural representation unless it represents a definitely different meaning from the context. Terms such as “include” or “has” used herein should be understood to indicate the existence of the several components, functions or steps disclosed in the specification, and it should also be understood that greater or fewer components, functions, or steps may likewise be utilized.
Mobile terminals presented herein may be implemented using a variety of different types of terminals. Examples of such terminals include cellular phones, smart phones, user equipment, laptop computers, digital broadcast terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigators, portable computers (PCs), slate PCs, tablet PCs, ultra books, wearable devices (for example, smart watches, smart glasses, head mounted displays (HMDs)), and the like.
By way of non-limiting example only, further description will be made with reference to particular types of mobile terminals. However, such teachings apply equally to other types of terminals, such as those types noted above. In addition, these teachings may also be applied to stationary terminals such as digital TVs, desktop computers, and the like.
Reference is now made to FIGS. 1A-1C, where FIG. 1A is a block diagram of a mobile terminal in accordance with the present disclosure, and FIGS. 1B and 1C are conceptual views of one example of the mobile terminal, viewed from different directions.
The mobile terminal 100 is shown having components such as a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, and a power supply unit 190. It is understood that implementing all of the illustrated components is not a requirement, and that greater or fewer components may alternatively be implemented.
Referring now to FIG. 1A, the mobile terminal 100 is shown having the wireless communication unit 110 configured with several commonly implemented components. For instance, the wireless communication unit 110 typically includes one or more components which permit wireless communication between the mobile terminal 100 and a wireless communication system or network within which the mobile terminal is located.
The wireless communication unit 110 typically includes one or more modules which permit communications such as wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal, communications between the mobile terminal 100 and an external server. Further, the wireless communication unit 110 typically includes one or more modules which connect the mobile terminal 100 to one or more networks. To facilitate such communications, the wireless communication unit 110 includes one or more of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.
The input unit 120 includes a camera 121 for obtaining images or video, a microphone 122, which is one type of audio input device for inputting an audio signal, and a user input unit 123 (for example, a touch key, a push key, a mechanical key, a soft key, and the like) for allowing a user to input information. Data (for example, audio, video, image, and the like) is obtained by the input unit 120 and may be analyzed and processed by the controller 180 according to device parameters, user commands, and combinations thereof.
The sensing unit 140 is typically implemented using one or more sensors configured to sense internal information of the mobile terminal, the surrounding environment of the mobile terminal, user information, and the like. For example, in FIG. 1A, the sensing unit 140 is shown having a proximity sensor 141 and an illumination sensor 142.
If desired, the sensing unit 140 may alternatively or additionally include other types of sensors or devices, such as a touch sensor, an acceleration sensor, a magnetic sensor, a G-sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a finger scan sensor, an ultrasonic sensor, an optical sensor (for example, camera 121), a microphone 122, a battery gauge, an environment sensor (for example, a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, and a gas sensor, among others), and a chemical sensor (for example, an electronic nose, a health care sensor, a biometric sensor, and the like), to name a few. The mobile terminal 100 may be configured to utilize information obtained from the sensing unit 140, and in particular, information obtained from one or more sensors of the sensing unit 140, and combinations thereof.
The output unit 150 is typically configured to output various types of information, such as audio, video, tactile output, and the like. The output unit 150 is shown having a display unit 151, an audio output module 152, a haptic module 153, and an optical output module 154.
The display unit 151 may have an inter-layered structure or an integrated structure with a touch sensor in order to facilitate a touch screen. The touch screen may provide an output interface between the mobile terminal 100 and a user, as well as function as the user input unit 123 which provides an input interface between the mobile terminal 100 and the user.
The interface unit 160 serves as an interface with various types of external devices that can be coupled to the mobile terminal 100. The interface unit 160, for example, may include any of wired or wireless ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, and the like. In some cases, the mobile terminal 100 may perform assorted control functions associated with a connected external device, in response to the external device being connected to the interface unit 160.
The memory 170 is typically implemented to store data to support various functions or features of the mobile terminal 100. For instance, the memory 170 may be configured to store application programs executed in the mobile terminal 100, data or instructions for operations of the mobile terminal 100, and the like. Some of these application programs may be downloaded from an external server via wireless communication. Other application programs may be installed within the mobile terminal 100 at time of manufacturing or shipping, which is typically the case for basic functions of the mobile terminal 100 (for example, receiving a call, placing a call, receiving a message, sending a message, and the like). It is common for application programs to be stored in the memory 170, installed in the mobile terminal 100, and executed by the controller 180 to perform an operation (or function) for the mobile terminal 100.
The controller 180 typically functions to control overall operation of the mobile terminal 100, in addition to the operations associated with the application programs. The controller 180 may provide or process information or functions appropriate for a user by processing signals, data, information and the like, which are input or output by the various components depicted in FIG. 1A, or activating application programs stored in the memory 170. As one example, the controller 180 controls some or all of the components illustrated in FIGS. 1A-1C according to the execution of an application program that has been stored in the memory 170.
The power supply unit 190 can be configured to receive external power or provide internal power in order to supply appropriate power required for operating elements and components included in the mobile terminal 100. The power supply unit 190 may include a battery, and the battery may be configured to be embedded in the terminal body, or configured to be detachable from the terminal body.
Referring still to FIG. 1A, various components depicted in this figure will now be described in more detail. Regarding the wireless communication unit 110, the broadcast receiving module 111 is typically configured to receive a broadcast signal and/or broadcast associated information from an external broadcast managing entity via a broadcast channel. The broadcast channel may include a satellite channel, a terrestrial channel, or both. In some embodiments, two or more broadcast receiving modules 111 may be utilized to facilitate simultaneous receiving of two or more broadcast channels, or to support switching among broadcast channels.
The mobile communication module 112 can transmit and/or receive wireless signals to and from one or more network entities. Typical examples of a network entity include a base station, an external mobile terminal, a server, and the like. Such network entities form part of a mobile communication network, which is constructed according to technical standards or communication methods for mobile communications (for example, Global System for Mobile Communication (GSM), Code Division Multi Access (CDMA), CDMA2000 (Code Division Multi Access 2000), EV-DO (Enhanced Voice-Data Optimized or Enhanced Voice-Data Only), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), HSUPA (High Speed Uplink Packet Access), Long Term Evolution (LTE), LTE-A (Long Term Evolution-Advanced), and the like). Examples of wireless signals transmitted and/or received via the mobile communication module 112 include audio call signals, video (telephony) call signals, or various formats of data to support communication of text and multimedia messages.
The wireless Internet module 113 is configured to facilitate wireless Internet access. This module may be internally or externally coupled to the mobile terminal 100. The wireless Internet module 113 may transmit and/or receive wireless signals via communication networks according to wireless Internet technologies.
Examples of such wireless Internet access include Wireless LAN (WLAN), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), HSUPA (High Speed Uplink Packet Access), Long Term Evolution (LTE), LTE-A (Long Term Evolution-Advanced), and the like. The wireless Internet module 113 may transmit/receive data according to one or more of such wireless Internet technologies, and other Internet technologies as well.
In some embodiments, when the wireless Internet access is implemented according to, for example, WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A and the like, as part of a mobile communication network, the wireless Internet module 113 performs such wireless Internet access. As such, the Internet module 113 may cooperate with, or function as, the mobile communication module 112.
The short-range communication module 114 is configured to facilitate short-range communications. Suitable technologies for implementing such short-range communications include BLUETOOTH™, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra-WideBand (UWB), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, Wireless USB (Wireless Universal Serial Bus), and the like. The short-range communication module 114 in general supports wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal 100, or communications between the mobile terminal and a network where another mobile terminal 100 (or an external server) is located, via wireless area networks. One example of the wireless area networks is a wireless personal area network.
In some embodiments, another mobile terminal (which may be configured similarly to the mobile terminal 100) may be a wearable device, for example, a smart watch, smart glasses or a head mounted display (HMD), which is able to exchange data with the mobile terminal 100 (or otherwise cooperate with the mobile terminal 100). The short-range communication module 114 may sense or recognize the wearable device, and permit communication between the wearable device and the mobile terminal 100. In addition, when the sensed wearable device is a device which is authenticated to communicate with the mobile terminal 100, the controller 180, for example, may cause transmission of data processed in the mobile terminal 100 to the wearable device via the short-range communication module 114. Hence, a user of the wearable device may use the data processed in the mobile terminal 100 on the wearable device. For example, when a call is received in the mobile terminal 100, the user may answer the call using the wearable device. Also, when a message is received in the mobile terminal 100, the user can check the received message using the wearable device.
The location information module 115 is generally configured to detect, calculate, derive or otherwise identify a position of the mobile terminal. As an example, the location information module 115 includes a Global Position System (GPS) module, a Wi-Fi module, or both. If desired, the location information module 115 may alternatively or additionally function with any of the other modules of the wireless communication unit 110 to obtain data related to the position of the mobile terminal.
As one example, when the mobile terminal uses a GPS module, a position of the mobile terminal may be acquired using a signal sent from a GPS satellite. As another example, when the mobile terminal uses the Wi-Fi module, a position of the mobile terminal can be acquired based on information related to a wireless access point (AP) which transmits or receives a wireless signal to or from the Wi-Fi module.
The input unit 120 may be configured to permit various types of input to the mobile terminal 100. Examples of such input include audio, image, video, data, and user input. Image and video input is often obtained using one or more cameras 121. Such cameras 121 may process image frames of still pictures or video obtained by image sensors in a video or image capture mode. The processed image frames can be displayed on the display unit 151 or stored in the memory 170. In some cases, the cameras 121 may be arranged in a matrix configuration to permit a plurality of images having various angles or focal points to be input to the mobile terminal 100. As another example, the cameras 121 may be located in a stereoscopic arrangement to acquire left and right images for implementing a stereoscopic image.
The microphone 122 is generally implemented to permit audio input to the mobile terminal 100. The audio input can be processed in various manners according to a function being executed in the mobile terminal 100. If desired, the microphone 122 may include assorted noise removing algorithms to remove unwanted noise generated in the course of receiving the external audio.
The user input unit 123 is a component that permits input by a user. Such user input may enable the controller 180 to control operation of the mobile terminal 100. The user input unit 123 may include one or more of a mechanical input element (for example, a key, a button located on a front and/or rear surface or a side surface of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, and the like), or a touch-sensitive input, among others. As one example, the touch-sensitive input may be a virtual key or a soft key, which is displayed on a touch screen through software processing, or a touch key which is located on the mobile terminal at a location that is other than the touch screen. On the other hand, the virtual key or the visual key may be displayed on the touch screen in various shapes, for example, graphic, text, icon, video, or a combination thereof.
The sensing unit 140 is generally configured to sense one or more of internal information of the mobile terminal, surrounding environment information of the mobile terminal, user information, or the like. The controller 180 generally cooperates with the sensing unit 140 to control operation of the mobile terminal 100 or execute data processing, a function or an operation associated with an application program installed in the mobile terminal based on the sensing provided by the sensing unit 140. The sensing unit 140 may be implemented using any of a variety of sensors, some of which will now be described in more detail.
The proximity sensor 141 may include a sensor to sense presence or absence of an object approaching a surface, or an object located near a surface, by using an electromagnetic field, infrared rays, or the like without a mechanical contact. The proximity sensor 141 may be arranged at an inner region of the mobile terminal covered by the touch screen, or near the touch screen.
The proximity sensor 141, for example, may include any of a transmissive type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitance type proximity sensor, a magnetic type proximity sensor, an infrared rays proximity sensor, and the like. When the touch screen is implemented as a capacitance type, the proximity sensor 141 can sense proximity of a pointer relative to the touch screen by changes of an electromagnetic field, which is responsive to an approach of an object with conductivity. In this case, the touch screen (touch sensor) may also be categorized as a proximity sensor.
The term “proximity touch” will often be referred to herein to denote the scenario in which a pointer is positioned to be proximate to the touch screen without contacting the touch screen. The term “contact touch” will often be referred to herein to denote the scenario in which a pointer makes physical contact with the touch screen. For the position corresponding to the proximity touch of the pointer relative to the touch screen, such position will correspond to a position where the pointer is perpendicular to the touch screen. The proximity sensor 141 may sense proximity touch, and proximity touch patterns (for example, distance, direction, speed, time, position, moving status, and the like).
In general, the controller 180 processes data corresponding to proximity touches and proximity touch patterns sensed by the proximity sensor 141, and causes output of visual information on the touch screen. In addition, the controller 180 can control the mobile terminal 100 to execute different operations or process different data according to whether a touch with respect to a point on the touch screen is either a proximity touch or a contact touch.
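As a simplified illustration of this branching (with hypothetical event types; a real touch stack exposes a much richer event model), a controller might dispatch proximity and contact touches as follows:

```kotlin
import kotlin.math.roundToInt

// Hypothetical event types; illustrative only.
sealed class TouchEvent(val x: Int, val y: Int)
class ProximityTouch(x: Int, y: Int, val distanceMm: Double) : TouchEvent(x, y)
class ContactTouch(x: Int, y: Int, val pressure: Double) : TouchEvent(x, y)

// Execute different operations according to the touch kind, as described above.
fun handleTouch(event: TouchEvent) = when (event) {
    is ProximityTouch ->  // pointer hovering near the screen: e.g., preview the target
        println("Preview item at (${event.x}, ${event.y}), ~${event.distanceMm.roundToInt()} mm away")
    is ContactTouch ->    // physical contact: e.g., select or activate the target
        println("Select item at (${event.x}, ${event.y}), pressure ${event.pressure}")
}

fun main() {
    handleTouch(ProximityTouch(x = 120, y = 640, distanceMm = 7.5))
    handleTouch(ContactTouch(x = 120, y = 640, pressure = 0.8))
}
```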
A touch sensor can sense a touch applied to the touch screen, such as the display unit 151, using any of a variety of touch methods. Examples of such touch methods include a resistive type, a capacitive type, an infrared type, and a magnetic field type, among others.
As one example, the touch sensor may be configured to convert changes of pressure applied to a specific part of the display unit 151, or changes of capacitance occurring at a specific part of the display unit 151, into electric input signals. The touch sensor may also be configured to sense not only a touched position and a touched area, but also touch pressure and/or touch capacitance. A touch object is generally used to apply a touch input to the touch sensor. Examples of typical touch objects include a finger, a touch pen, a stylus pen, a pointer, or the like.
When a touch input is sensed by a touch sensor, corresponding signals may be transmitted to a touch controller. The touch controller may process the received signals, and then transmit corresponding data to the controller 180. Accordingly, the controller 180 may sense which region of the display unit 151 has been touched. Here, the touch controller may be a component separate from the controller 180, the controller 180 itself, or a combination thereof.
In some embodiments, the controller 180 may execute the same or different controls according to a type of touch object that touches the touch screen or a touch key provided in addition to the touch screen. Whether to execute the same or different control according to the object which provides a touch input may be decided based on a current operating state of the mobile terminal 100 or a currently executed application program, for example.
The touch sensor and the proximity sensor may be implemented individually, or in combination, to sense various types of touches. Such touches include a short (or tap) touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, a hovering touch, and the like.
If desired, an ultrasonic sensor may be implemented to recognize position information relating to a touch object using ultrasonic waves. The controller 180, for example, may calculate a position of a wave generation source based on information sensed by an illumination sensor and a plurality of ultrasonic sensors. Since light is much faster than ultrasonic waves, the time for the light to reach the optical sensor is much shorter than the time for the ultrasonic wave to reach the ultrasonic sensor. The position of the wave generation source may be calculated using this fact. For instance, the position of the wave generation source may be calculated using the time difference from the time that the ultrasonic wave reaches the sensor, with the light serving as a reference signal.
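To make the time-difference idea concrete, here is a back-of-the-envelope Kotlin sketch (all values and names are illustrative, not taken from the disclosure): light reaches the optical sensor effectively instantly, so the interval between the light reference and each ultrasonic arrival yields the source's distance to that sensor, and two such distances locate the source in a plane.

```kotlin
import kotlin.math.sqrt

const val SPEED_OF_SOUND_MM_PER_US = 0.343  // ~343 m/s at room temperature

// Distance from the wave source to a sensor, from the arrival-time difference
// between the (effectively instantaneous) light and the ultrasonic wave.
fun distanceMm(ultrasoundArrivalUs: Double, lightArrivalUs: Double): Double =
    (ultrasoundArrivalUs - lightArrivalUs) * SPEED_OF_SOUND_MM_PER_US

fun main() {
    val tLight = 0.0  // light arrival defines the reference time
    // Arrival times at two ultrasonic sensors placed 100 mm apart on the x-axis.
    val d1 = distanceMm(ultrasoundArrivalUs = 291.5, lightArrivalUs = tLight)  // ~100 mm
    val d2 = distanceMm(ultrasoundArrivalUs = 412.2, lightArrivalUs = tLight)  // ~141 mm
    // Planar trilateration with sensors at (0, 0) and (100, 0).
    val spacing = 100.0
    val x = (d1 * d1 - d2 * d2 + spacing * spacing) / (2 * spacing)
    val y = sqrt((d1 * d1 - x * x).coerceAtLeast(0.0))
    println("Estimated source position: (%.1f, %.1f) mm".format(x, y))  // ≈ (0.0, 100.0)
}
```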
The camera 121 typically includes at least one of a camera sensor (CCD, CMOS, etc.), a photo sensor (or image sensor), and a laser sensor.
Implementing the camera 121 with a laser sensor may allow detection of a touch of a physical object with respect to a 3D stereoscopic image. The photo sensor may be laminated on, or overlapped with, the display device. The photo sensor may be configured to scan movement of the physical object in proximity to the touch screen. In more detail, the photo sensor may include photo diodes and transistors at rows and columns to scan content received at the photo sensor using an electrical signal which changes according to the quantity of applied light. Namely, the photo sensor may calculate the coordinates of the physical object according to variation of light to thus obtain position information of the physical object.
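One toy way to express this coordinate calculation (hypothetical; an actual photo sensor integrates this in hardware) is to treat the photodiode grid as a 2D array of light readings and take the intensity-weighted centroid of the cells whose readings changed:

```kotlin
import kotlin.math.abs

// Toy centroid-of-change calculation over a photodiode grid; illustrative only.
fun centroidOfChange(before: Array<IntArray>, after: Array<IntArray>): Pair<Double, Double>? {
    var weight = 0.0
    var sumX = 0.0
    var sumY = 0.0
    for (row in after.indices) {
        for (col in after[row].indices) {
            val delta = abs(after[row][col] - before[row][col]).toDouble()
            weight += delta
            sumX += col * delta
            sumY += row * delta
        }
    }
    // (column, row) of the object, or null if no light variation was detected.
    return if (weight == 0.0) null else Pair(sumX / weight, sumY / weight)
}

fun main() {
    val before = arrayOf(intArrayOf(9, 9, 9), intArrayOf(9, 9, 9), intArrayOf(9, 9, 9))
    val after = arrayOf(intArrayOf(9, 9, 9), intArrayOf(9, 3, 9), intArrayOf(9, 9, 9))
    println(centroidOfChange(before, after))  // (1.0, 1.0): object shadows the center cell
}
```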
The display unit 151 is generally configured to output information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program executing at the mobile terminal 100 or user interface (UI) and graphic user interface (GUI) information in response to the execution screen information.
In some embodiments, the display unit 151 may be implemented as a stereoscopic display unit for displaying stereoscopic images. A typical stereoscopic display unit may employ a stereoscopic display scheme such as a stereoscopic scheme (a glass scheme), an auto-stereoscopic scheme (glassless scheme), a projection scheme (holographic scheme), or the like.
The audio output module 152 is generally configured to output audio data. Such audio data may be obtained from any of a number of different sources, such that the audio data may be received from the wireless communication unit 110 or may have been stored in the memory 170. The audio data may be output during modes such as a signal reception mode, a call mode, a record mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 can provide audible output related to a particular function (e.g., a call signal reception sound, a message reception sound, etc.) performed by the mobile terminal 100. The audio output module 152 may also be implemented as a receiver, a speaker, a buzzer, or the like.
A haptic module 153 can be configured to generate various tactile effects that a user feels, perceives, or otherwise experiences. A typical example of a tactile effect generated by the haptic module 153 is vibration. The strength, pattern and the like of the vibration generated by the haptic module 153 can be controlled by user selection or setting by the controller. For example, the haptic module 153 may output different vibrations in a combining manner or a sequential manner.
Besides vibration, the haptic module 153 can generate various other tactile effects, including an effect by stimulation such as a pin arrangement vertically moving to contact skin, a spray force or suction force of air through a jet orifice or a suction opening, a touch to the skin, a contact of an electrode, electrostatic force, an effect by reproducing the sense of cold and warmth using an element that can absorb or generate heat, and the like.
The haptic module 153 can also be implemented to allow the user to feel a tactile effect through a muscle sensation such as in the user's fingers or arm, as well as transferring the tactile effect through direct contact. Two or more haptic modules 153 may be provided according to the particular configuration of the mobile terminal 100.
An optical output module 154 can output a signal for indicating an event generation using light of a light source. Examples of events generated in the mobile terminal 100 may include message reception, call signal reception, a missed call, an alarm, a schedule notice, an email reception, information reception through an application, and the like.
A signal output by the optical output module 154 may be implemented in such a manner that the mobile terminal emits monochromatic light or light with a plurality of colors. The signal output may be terminated as the mobile terminal senses that a user has checked the generated event, for example.
The interface unit 160 serves as an interface for external devices to be connected with the mobile terminal 100. For example, the interface unit 160 can receive data transmitted from an external device, receive power to transfer to elements and components within the mobile terminal 100, or transmit internal data of the mobile terminal 100 to such external device. The interface unit 160 may include wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, or the like.
The identification module may be a chip that stores various information for authenticating authority to use the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. In addition, the device having the identification module (also referred to herein as an “identifying device”) may take the form of a smart card. Accordingly, the identifying device can be connected with the terminal 100 via the interface unit 160.
When the mobile terminal 100 is connected with an external cradle, the interface unit 160 can serve as a passage to allow power from the cradle to be supplied to the mobile terminal 100 or may serve as a passage to allow various command signals input by the user from the cradle to be transferred to the mobile terminal therethrough. Various command signals or power input from the cradle may operate as signals for recognizing that the mobile terminal is properly mounted on the cradle.
The memory 170 can store programs to support operations of the controller 180 and store input/output data (for example, phonebook, messages, still images, videos, etc.). The memory 170 may store data related to various patterns of vibrations and audio which are output in response to touch inputs on the touch screen.
The memory 170 may include one or more types of storage mediums including a Flash memory, a hard disk, a solid state disk, a silicon disk, a multimedia card micro type, a card-type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Programmable Read-Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. The mobile terminal 100 may also be operated in relation to a network storage device that performs the storage function of the memory 170 over a network, such as the Internet.
The controller 180 may typically control the general operations of the mobile terminal 100. For example, the controller 180 may set or release a lock state for restricting a user from inputting a control command with respect to applications when a status of the mobile terminal meets a preset condition.
The controller 180 can also perform the controlling and processing associated with voice calls, data communications, video calls, and the like, or perform pattern recognition processing to recognize a handwriting input or a picture drawing input performed on the touch screen as characters or images, respectively. In addition, the controller 180 can control one or a combination of those components in order to implement various exemplary embodiments disclosed herein.
The power supply unit 190 receives external power or provides internal power and supplies the appropriate power required for operating respective elements and components included in the mobile terminal 100. The power supply unit 190 may include a battery, which is typically rechargeable or detachably coupled to the terminal body for charging.
The power supply unit 190 may include a connection port. The connection port may be configured as one example of the interface unit 160 to which an external charger for supplying power to recharge the battery is electrically connected.
As another example, the power supply unit 190 may be configured to recharge the battery in a wireless manner without use of the connection port. In this example, the power supply unit 190 can receive power, transferred from an external wireless power transmitter, using at least one of an inductive coupling method which is based on magnetic induction or a magnetic resonance coupling method which is based on electromagnetic resonance.
Various embodiments described herein may be implemented in a computer-readable medium, a machine-readable medium, or similar medium using, for example, software, hardware, or any combination thereof.
Referring now to FIGS. 1B and 1C, the mobile terminal 100 is described with reference to a bar-type terminal body. However, the mobile terminal 100 may alternatively be implemented in any of a variety of different configurations. Examples of such configurations include watch-type, clip-type, glasses-type, or folder-type, flip-type, slide-type, swing-type, and swivel-type configurations in which two or more bodies are combined with each other in a relatively movable manner, and combinations thereof. Discussion herein will often relate to a particular type of mobile terminal (for example, bar-type, watch-type, glasses-type, and the like). However, such teachings with regard to a particular type of mobile terminal will generally apply to other types of mobile terminals as well.
The mobile terminal 100 will generally include a case (for example, frame, housing, cover, and the like) forming the appearance of the terminal. In this embodiment, the case is formed using a front case 101 and a rear case 102. Various electronic components are incorporated into a space formed between the front case 101 and the rear case 102. At least one middle case may be additionally positioned between the front case 101 and the rear case 102.
The display unit 151 is shown located on the front side of the terminal body to output information. As illustrated, a window 151a of the display unit 151 may be mounted to the front case 101 to form the front surface of the terminal body together with the front case 101.
In some embodiments, electronic components may also be mounted to the rear case 102. Examples of such electronic components include a detachable battery 191, an identification module, a memory card, and the like. A rear cover 103 is shown covering the electronic components, and this cover may be detachably coupled to the rear case 102. Therefore, when the rear cover 103 is detached from the rear case 102, the electronic components mounted to the rear case 102 are externally exposed.
As illustrated, when the rear cover 103 is coupled to the rear case 102, a side surface of the rear case 102 is partially exposed. In some cases, upon the coupling, the rear case 102 may also be completely shielded by the rear cover 103. In some embodiments, the rear cover 103 may include an opening for externally exposing a camera 121b or an audio output module 152b.
The cases 101, 102, 103 may be formed by injection-molding synthetic resin or may be formed of a metal, for example, stainless steel (STS), aluminum (Al), titanium (Ti), or the like.
As an alternative to the example in which the plurality of cases form an inner space for accommodating components, the mobile terminal 100 may be configured such that one case forms the inner space. In this example, a mobile terminal 100 having a uni-body is formed in such a manner that synthetic resin or metal extends from a side surface to a rear surface.
If desired, the mobile terminal 100 may include a waterproofing unit (not shown) for preventing introduction of water into the terminal body. For example, the waterproofing unit may include a waterproofing member which is located between the window 151a and the front case 101, between the front case 101 and the rear case 102, or between the rear case 102 and the rear cover 103, to hermetically seal an inner space when those cases are coupled.
FIGS. 1B and 1C depict certain components as arranged on the mobile terminal. However, it is to be understood that alternative arrangements are possible and within the teachings of the instant disclosure. Some components may be omitted or rearranged. For example, the first manipulation unit 123a may be located on another surface of the terminal body, and the second audio output module 152b may be located on the side surface of the terminal body.
The display unit 151 outputs information processed in the mobile terminal 100. The display unit 151 may be implemented using one or more suitable display devices. Examples of such suitable display devices include a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light emitting diode (OLED), a flexible display, a 3-dimensional (3D) display, an e-ink display, and combinations thereof.
The display unit 151 may be implemented using two display devices, which can implement the same or different display technology. For instance, a plurality of the display units 151 may be arranged on one side, either spaced apart from each other, or these devices may be integrated, or these devices may be arranged on different surfaces.
The display unit 151 may also include a touch sensor which senses a touch input received at the display unit. When a touch is input to the display unit 151, the touch sensor may be configured to sense this touch and the controller 180, for example, may generate a control command or other signal corresponding to the touch. The content which is input in the touching manner may be a text or numerical value, or a menu item which can be indicated or designated in various modes.
The touch sensor may be configured in a form of a film having a touch pattern, disposed between the window 151a and a display on a rear surface of the window 151a, or a metal wire which is patterned directly on the rear surface of the window 151a. Alternatively, the touch sensor may be integrally formed with the display. For example, the touch sensor may be disposed on a substrate of the display or within the display.
The display unit 151 may also form a touch screen together with the touch sensor. Here, the touch screen may serve as the user input unit 123 (see FIG. 1A). Therefore, the touch screen may replace at least some of the functions of the first manipulation unit 123a.
The first audio output module 152a may be implemented in the form of a speaker to output voice audio, alarm sounds, multimedia audio reproduction, and the like.
The window 151a of the display unit 151 will typically include an aperture to permit audio generated by the first audio output module 152a to pass. One alternative is to allow audio to be released along an assembly gap between the structural bodies (for example, a gap between the window 151a and the front case 101). In this case, a hole independently formed to output audio sounds may not be seen or is otherwise hidden in terms of appearance, thereby further simplifying the appearance and manufacturing of the mobile terminal 100.
The optical output module 154 can be configured to output light for indicating an event generation. Examples of such events include a message reception, a call signal reception, a missed call, an alarm, a schedule notice, an email reception, information reception through an application, and the like. When a user has checked a generated event, the controller can control the optical output module 154 to stop the light output.
The first camera 121a can process image frames such as still or moving images obtained by the image sensor in a capture mode or a video call mode. The processed image frames can then be displayed on the display unit 151 or stored in the memory 170.
The first and second manipulation units 123a and 123b are examples of the user input unit 123, which may be manipulated by a user to provide input to the mobile terminal 100. The first and second manipulation units 123a and 123b may also be commonly referred to as a manipulating portion, and may employ any tactile method that allows the user to perform manipulation such as touch, push, scroll, or the like. The first and second manipulation units 123a and 123b may also employ any non-tactile method that allows the user to perform manipulation such as proximity touch, hovering, or the like.
FIG. 1B illustrates the first manipulation unit 123a as a touch key, but possible alternatives include a mechanical key, a push key, a touch key, and combinations thereof.
Input received at the first and second manipulation units 123a and 123b may be used in various ways. For example, the first manipulation unit 123a may be used by the user to provide an input to a menu, home key, cancel, search, or the like, and the second manipulation unit 123b may be used by the user to provide an input to control a volume level being output from the first or second audio output modules 152a or 152b, to switch to a touch recognition mode of the display unit 151, or the like.
As another example of the user input unit 123, a rear input unit (not shown) may be located on the rear surface of the terminal body. The rear input unit can be manipulated by a user to provide input to the mobile terminal 100. The input may be used in a variety of different ways. For example, the rear input unit may be used by the user to provide an input for power on/off, start, end, scroll, control of a volume level being output from the first or second audio output modules 152a or 152b, switching to a touch recognition mode of the display unit 151, and the like. The rear input unit may be configured to permit touch input, push input, or combinations thereof.
The rear input unit may be located to overlap the display unit 151 of the front side in a thickness direction of the terminal body. As one example, the rear input unit may be located on an upper end portion of the rear side of the terminal body such that a user can easily manipulate it using a forefinger when the user grabs the terminal body with one hand. Alternatively, the rear input unit can be positioned at almost any location of the rear side of the terminal body.
Embodiments that include the rear input unit may implement some or all of the functionality of the first manipulation unit 123a in the rear input unit. As such, in situations where the first manipulation unit 123a is omitted from the front side, the display unit 151 can have a larger screen.
As a further alternative, themobile terminal100 may include a finger scan sensor which scans a user's fingerprint. Thecontroller180 can then use fingerprint information sensed by the finger scan sensor as part of an authentication procedure. The finger scan sensor may also be installed in thedisplay unit151 or implemented in theuser input unit123.
Themicrophone122 is shown located at an end of themobile terminal100, but other locations are possible. If desired, multiple microphones may be implemented, with such an arrangement permitting the receiving of stereo sounds.
Theinterface unit160 may serve as a path allowing themobile terminal100 to interface with external devices. For example, theinterface unit160 may include one or more of a connection terminal for connecting to another device (for example, an earphone, an external speaker, or the like), a port for near field communication (for example, an Infrared Data Association (IrDA) port, a Bluetooth port, a wireless LAN port, and the like), or a power supply terminal for supplying power to themobile terminal100. Theinterface unit160 may be implemented in the form of a socket for accommodating an external card, such as Subscriber Identification Module (SIM), User Identity Module (UIM), or a memory card for information storage.
The second camera 121b is shown located at the rear side of the terminal body and includes an image capturing direction that is substantially opposite to the image capturing direction of the first camera unit 121a. If desired, the second camera 121b may alternatively be located at other locations, or made to be moveable, in order to have a different image capturing direction from that which is shown.
The second camera 121b can include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix configuration. The cameras may be referred to as an "array camera." When the second camera 121b is implemented as an array camera, images may be captured in various manners using the plurality of lenses, and images with better quality may be obtained.
As shown in FIG. 1C, a flash 124 is shown adjacent to the second camera 121b. When an image of a subject is captured with the camera 121b, the flash 124 may illuminate the subject.
As shown in FIG. 1B, the second audio output module 152b can be located on the terminal body. The second audio output module 152b may implement stereophonic sound functions in conjunction with the first audio output module 152a, and may be also used for implementing a speaker phone mode for call communication.
At least one antenna for wireless communication may be located on the terminal body. The antenna may be installed in the terminal body or formed by the case. For example, an antenna which configures a part of the broadcast receiving module 111 may be retractable into the terminal body. Alternatively, an antenna may be formed using a film attached to an inner surface of the rear cover 103, or a case that includes a conductive material.
A power supply unit 190 for supplying power to the mobile terminal 100 may include a battery 191, which is mounted in the terminal body or detachably coupled to an outside of the terminal body. The battery 191 may receive power via a power source cable connected to the interface unit 160. Also, the battery 191 can be recharged in a wireless manner using a wireless charger. Wireless charging may be implemented by magnetic induction or electromagnetic resonance.
The rear cover 103 is shown coupled to the rear case 102 for shielding the battery 191, to prevent separation of the battery 191, and to protect the battery 191 from an external impact or from foreign material. When the battery 191 is detachable from the terminal body, the rear cover 103 may be detachably coupled to the rear case 102.
An accessory for protecting an appearance or assisting or extending the functions of the mobile terminal 100 can also be provided on the mobile terminal 100. As one example of an accessory, a cover or pouch for covering or accommodating at least one surface of the mobile terminal 100 may be provided. The cover or pouch may cooperate with the display unit 151 to extend the function of the mobile terminal 100. Another example of the accessory is a touch pen for assisting or extending a touch input to a touch screen.
Hereinafter, a communication system which is operable with the mobile terminal 100 according to the present disclosure will be described.
FIGS. 2A and 2B are conceptual views of a communication system operable with a mobile terminal in accordance with the present disclosure.
First, referring to FIG. 2A, such communication systems utilize different air interfaces and/or physical layers. Examples of such air interfaces utilized by the communication systems include Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Universal Mobile Telecommunications System (UMTS), the Long Term Evolution (LTE) of the UMTS, the Global System for Mobile Communications (GSM), and the like.
By way of non-limiting example only, further description will relate to a CDMA communication system, but such teachings apply equally to other system types.
Referring now to FIG. 2A, a CDMA wireless communication system is shown having a plurality of mobile terminals 100, a plurality of base stations (BSs) 270, base station controllers (BSCs) 275, and a mobile switching center (MSC) 280. The MSC 280 is configured to interface with a conventional Public Switched Telephone Network (PSTN) 290. The MSC 280 is also configured to interface with the BSCs 275. The BSCs 275 are coupled to the base stations 270 via backhaul lines. The backhaul lines may be configured in accordance with any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. Hence, a plurality of BSCs 275 can be included in the system as shown in FIG. 2A.
Each base station 270 may include one or more sectors, each sector having an omni-directional antenna or an antenna pointed in a particular direction radially away from the base station 270. Alternatively, each sector may include two or more different antennas. Each base station 270 may be configured to support a plurality of frequency assignments, with each frequency assignment having a particular spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of sector and frequency assignment may be referred to as a CDMA channel. The base stations 270 may also be referred to as Base Station Transceiver Subsystems (BTSs). In some cases, the term "base station" may be used to refer collectively to a BSC 275, and one or more base stations 270. The base stations may also be denoted as "cell sites." Alternatively, individual sectors of a given base station 270 may be referred to as cell sites.
A broadcasting transmitter (BT) 295, as shown in FIG. 2A, transmits a broadcast signal to the mobile terminals 100 operating within the system. The broadcast receiving module 111 (FIG. 1A) is typically configured inside the mobile terminal 100 to receive broadcast signals transmitted by the BT 295.
FIG. 2A further depicts several Global Positioning System (GPS) satellites 300. Such satellites 300 facilitate locating the position of at least one of plural mobile terminals 100. Two satellites are depicted in FIG. 2A, but it is understood that useful position information may be obtained with greater or fewer satellites than two satellites. The GPS module 115 (FIG. 1A) is typically configured to cooperate with the satellites 300 to obtain desired position information. It is to be appreciated that other types of position detection technology (i.e., location technology that may be used in addition to or instead of GPS location technology) may alternatively be implemented. If desired, at least one of the GPS satellites 300 may alternatively or additionally be configured to provide satellite DMB transmissions.
During typical operation of the wireless communication system, the base stations 270 receive sets of reverse-link signals from various mobile terminals 100, which are engaging in calls, messaging, and other communications. Each reverse-link signal received by a given base station 270 is processed within that base station 270. The resulting data is forwarded to an associated BSC 275. The BSC 275 provides call resource allocation and mobility management functionality including the orchestration of soft handoffs between base stations 270. The BSCs 275 also route the received data to the MSC 280, which then provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN 290 interfaces with the MSC 280, and the MSC 280 interfaces with the BSCs 275, which in turn control the base stations 270 to transmit sets of forward-link signals to the mobile terminals 100.
Hereinafter, description will be given of a method for acquiring location information of a mobile terminal using a wireless fidelity (WiFi) positioning system (WPS), with reference to FIG. 2B.
The WiFi positioning system (WPS) 300 refers to a location determination technology based on a wireless local area network (WLAN) using WiFi. It tracks the location of the mobile terminal 100 using a WiFi module provided in the mobile terminal 100 and a wireless access point 320 for transmitting signals to and receiving signals from the WiFi module.
The WiFi positioning system 300 may include a WiFi location determination server 310, a mobile terminal 100, a wireless access point (AP) 320 connected to the mobile terminal 100, and a database 330 in which wireless AP information is stored.
The WiFi location determination server 310 extracts the information of the wireless AP 320 connected to the mobile terminal 100 based on a location information request message (or signal) of the mobile terminal 100. The information of the wireless AP 320 may be transmitted to the WiFi location determination server 310 through the mobile terminal 100, or transmitted to the WiFi location determination server 310 from the wireless AP 320.
The information of the wireless AP extracted based on the location information request message of the mobile terminal 100 may be at least one of a MAC address, SSID, RSSI, channel information, privacy, network type, signal strength, and noise strength.
The WiFi location determination server 310 receives the information of the wireless AP 320 connected to the mobile terminal 100 as described above, and compares the received wireless AP 320 information with information contained in the pre-established database 330 to extract (or analyze) the location information of the mobile terminal 100.
Referring to FIG. 2B, as an example, the wireless APs connected to the mobile terminal 100 are illustrated as a first, a second, and a third wireless AP 320. However, the number of wireless APs connected to the mobile terminal 100 may vary according to the wireless communication environment in which the mobile terminal 100 is located. When the mobile terminal 100 is connected to at least one wireless AP, the WiFi positioning system 300 can track the location of the mobile terminal 100.
Next, considering the database 330 in which wireless AP information is stored in more detail, various information on wireless APs disposed at different locations may be stored in the database 330.
The information on wireless APs stored in the database 330 may include a MAC address, SSID, RSSI, channel information, privacy, network type, latitude and longitude coordinates, the building at which a wireless AP is located, a floor number, detailed indoor location information (GPS coordinates, where available), the AP owner's address, a phone number, and the like.
In this manner, wireless AP information and the location information corresponding to each wireless AP are stored together in the database 330. Thus, the WiFi location determination server 310 may retrieve, from the database 330, the wireless AP information corresponding to the information of the wireless AP 320 connected to the mobile terminal 100, and extract the location information matched to the retrieved wireless AP, thereby extracting the location information of the mobile terminal 100.
Furthermore, the extracted location information of the mobile terminal 100 may be transmitted to the mobile terminal 100 through the WiFi location determination server 310, so that the mobile terminal 100 can acquire its own location information.
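As an illustrative aside, the comparison step described above can be approximated in a few lines of code. The following Kotlin sketch is not part of the disclosed system; it merely models how a location server might score pre-established database entries against the reported AP information (MAC address and RSSI). All names, data shapes, and the weighting rule are assumptions.

```kotlin
// Minimal sketch of WPS-style matching, assuming a pre-established database of
// (MAC address -> known location, expected RSSI) entries. All names are hypothetical.
data class ApRecord(val mac: String, val latitude: Double, val longitude: Double, val expectedRssi: Int)
data class ApObservation(val mac: String, val rssi: Int)

// Estimate a location by averaging the known positions of matched APs,
// weighting each AP by how closely its observed RSSI matches the stored value.
fun estimateLocation(observed: List<ApObservation>, database: Map<String, ApRecord>): Pair<Double, Double>? {
    val matches = observed.mapNotNull { obs ->
        database[obs.mac]?.let { rec ->
            val weight = 1.0 / (1 + kotlin.math.abs(obs.rssi - rec.expectedRssi))
            Triple(rec.latitude, rec.longitude, weight)
        }
    }
    if (matches.isEmpty()) return null // no AP in range is known to the database
    val total = matches.sumOf { it.third }
    val lat = matches.sumOf { it.first * it.third } / total
    val lon = matches.sumOf { it.second * it.third } / total
    return lat to lon
}
```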
On the other hand, a mobile terminal according to the present disclosure may have various design forms. Hereinafter, a mobile terminal having a lateral display unit and a user interface using the same will be described as one of the structural changes and improvements. FIG. 3A is a front perspective view illustrating another example of a mobile terminal associated with the present disclosure, and FIG. 3B is a rear perspective view of the mobile terminal illustrated in FIG. 3A.
A mobile terminal 200 disclosed herein has a portable phone body in a bar shape. However, the present disclosure may not be necessarily limited to this, and may be also applicable to various structures of terminals such as a slide type, a folder type, a swing type, a swivel type, and the like, in which two or more bodies are combined with each other in a relatively movable manner.
The body includes a case (casing, housing, cover, etc.) forming an appearance of the terminal. In this embodiment, the case may be divided into a front case 201 and a rear case 202. Various electronic components may be incorporated in a space formed between the front case 201 and the rear case 202. At least one middle case may be additionally disposed between the front case 201 and the rear case 202.
The cases may be formed by injection-molding a synthetic resin or may be also formed of a metal material such as stainless steel (STS), titanium (Ti), or the like.
A display unit 251, an audio output module 252, a camera module 221, and the like may be disposed on the terminal body, mainly on the front case 201.
The display unit 251 occupies most of a main surface of the front case 201. In other words, the display unit is disposed on a front surface of the terminal, and formed to display visual information. The display unit 251 according to the present disclosure may be formed on a front surface of the terminal as well as in a form extended to another surface of the terminal. More specifically, the display unit 251 may include a first region 261 disposed on the front surface, and a second region 262, 263 extended from the first region 261 and disposed on a lateral surface of the body. Here, the lateral surface may be a surface which is seen by the user when the mobile terminal is seen from the side.
On the other hand, at least part of the second region 262, 263 may be disposed on the front surface. For example, the second region 262, 263 may be formed over a lateral surface and a front surface of the terminal. Here, whether or not the second region 262, 263 is seen from the front surface is determined according to the structure in which the first and the second regions 261, 262, 263 are formed on the display unit 251.
For example, a window disposed on an upper surface of the display unit 251 may be formed in a shape in which both lateral surfaces are bent down, and through this, the appearance of a front surface and a lateral surface of the body is formed by the window. Accordingly, the first region 261 and the second region 262, 263 may be connected to each other in a shape having no physical interface. In this case, the display unit 251 is formed in a bent shape, and may include display elements integrated to correspond to the window.
For another example, the display unit 251 may be a flexible display unit. The flexible display may include a flexible, bendable, twistable, foldable and rollable display. Here, the flexible display unit may include both a typical flexible display and electronic paper.
Here, a typical flexible display may be a light and non-fragile display fabricated on a thin and flexible substrate that can be warped, bent, folded or rolled like a paper sheet while maintaining the display characteristics of a flat display in the related art.
Furthermore, electronic paper is a display technology to which a typical characteristic of ink is applied, and differs from a typical flat panel display in that reflected light is used. Electronic paper may change a drawing or text using twist balls or electrophoresis using capsules.
In this manner, it may be possible to configure a terminal body having a form in which both lateral surfaces of the display unit are warped by flexible material properties.
On the other hand, a form in which a flexible display unit is extended to a front surface and both lateral surfaces thereof has been described in the above, but the present disclosure may be also configured with a form in which three independent display units are disposed on a front surface and both lateral surfaces thereof. For example, a front display unit 261 and lateral display units 262, 263 may be independent display units, respectively, disposed in an adjacent manner.
The audio output unit 252 and a camera module 221 may be disposed in a region adjacent to one of both end portions of the display unit 251, and a front input unit (not shown) and a microphone 222 may be disposed in a region adjacent to the other end portion.
The front input unit, as an example of the user input unit 230 (refer to FIG. 1A), may include a plurality of manipulating units. The manipulating units may be commonly referred to as a manipulating portion, and any tactile method allowing the user to perform manipulation with a tactile feeling may be employed.
Furthermore, the display unit 251 may form a touch screen along with a touch sensor, and in this case, the touch screen may serve as a user input unit. Through this, it may be possible to have a configuration with no front input unit on a front surface of the terminal. In this case, the mobile terminal may be configured to enable input manipulation to the terminal body only through the display unit 251 and a rear input unit 232 which will be described later.
Referring to FIG. 3B, a camera module 221′ may be additionally mounted on a rear surface of the terminal body, namely, the rear case 202. The camera module 221′ has an image capturing direction which is substantially opposite to the direction of the camera module 221 (refer to FIG. 3A), and may have a different number of pixels from that of the camera module 221.
For example, it is preferable that the camera module 221 has a relatively small number of pixels, sufficient not to cause difficulty when the user captures his or her own face and immediately sends it to the other party during a video call or the like, and that the camera module 221′ has a relatively large number of pixels, since the user often captures a general object that is not sent immediately. The camera modules 221, 221′ may be provided in the terminal body in a rotatable and pop-upable manner.
A flash and a mirror may be additionally disposed adjacent to the camera module 221′. The flash illuminates light toward a subject when capturing the subject with the camera module 221′. The mirror allows the user to look at his or her own face, or the like, in a reflected way when capturing himself or herself (in a self-portrait mode) by using the camera module 221′.
An audio output unit (not shown) may be additionally disposed on a rear surface of the terminal body. The audio output unit on the rear surface, together with the audio output unit 252 (refer to FIG. 3A) on the front surface, can implement a stereo function, and may be also used to implement a speaker phone mode during a phone call.
In other words, a second audio output unit configured with a speaker on a rear surface of the terminal may be formed along with the audio output unit 252 (first audio output unit) configured with a receiver on a front surface thereof.
A power supply unit 290 for supplying power to the portable terminal 200 may be mounted on the terminal body. The power supply unit 290 may be configured to be incorporated in the terminal body, or directly detachable from the outside of the terminal body.
According to the drawing, a rear input unit 232 may be disposed on a rear surface of the terminal body. The rear input unit 232 may be located at a lower portion of the camera module 221′, for example.
The rear input unit 232 may be manipulated to receive a command for controlling the operation of the mobile terminal 200, and the received content may be set in various ways. For example, it may be possible to receive a command such as power on/off, start, end, scroll or the like, or a command such as adjustment of the volume output from the audio output unit 252, switching to a touch recognition mode of the display unit 251, or the like. However, the present disclosure may not be necessarily limited to this, and the terminal may include only either one or both of the front input unit and the rear input unit 232.
On the other hand, as described above, the controller 180 (refer to FIG. 1A) may control the function of the terminal using a display unit disposed on a lateral surface of the terminal.
Hereinafter, a method of controlling the function of the terminal using display units disposed on both lateral surfaces thereof will be described in more detail with reference to the accompanying drawings.
For the sake of convenience of explanation, the second region 262 denotes a display region disposed at the left side of both lateral surfaces based on the front surface, and the third region 263 denotes a display region disposed at the right side of both lateral surfaces based on the front surface. Furthermore, the first region 261 denotes a display (front display) disposed on the front surface.
Furthermore, though it is illustrated that the display unit 251 according to the present disclosure includes display regions at both the left and the right sides thereof, the display unit 251 may on the contrary include a lateral display unit at only one of the left and the right sides thereof based on the first region 261. However, according to the present disclosure, for the sake of convenience of explanation, the display unit 251 including display regions (or display units) at a front surface, a left side, and a right side thereof, respectively, will be described as an example.
Moreover, according to the present disclosure, a front display unit (or first region) will be described using reference numeral "261" or "151a", and a lateral display unit (or second region) will be described using reference numeral "151b", "262" or "263". The lateral display unit is also called a "touch sensing unit".
Moreover, content that is applicable to both the front and lateral display units will be described with reference to a "display unit 251 or 151" instead of reference numerals distinguishing the front and lateral display units.
Hereinafter, a control method of combining the front and lateral surfaces of the display unit to access an application and execute the application or use it as a folder will be described.
FIG. 4 is a flow chart for explaining a control method according to an embodiment of the present disclosure, and FIGS. 5A(a) to 5D(c) are conceptual views for explaining the control method of FIG. 4.
The display unit 151a displays first screen information 510. For example, the first screen information 510 may correspond to an execution screen of an activated application, for example, an execution screen of a message application.
The mobile terminal 100 may execute a plurality of applications, and activate one of the plurality of applications. Though not shown in the drawing, the controller may control the display unit to display visual information on the applications being executed in response to a preset control command. The controller may control the display unit to sequentially display the visual information in the activated sequence of the corresponding applications based on the control command.
The touch sensing unit 151b is formed adjacent to both lateral surfaces of the display unit 151 along a length direction of the display unit 151. The user may apply touch inputs with two fingers to the touch sensing units 151b formed on either side of the display unit 151 interposed therebetween.
Though not shown in detail in the drawing, the touch sensing unit 151b may be configured with a display device for displaying visual data, and may be formed to be incorporated with the display unit 151 as described above. Accordingly, preset visual information may be displayed on the touch sensing unit 151b configured with the display device. For example, the controller may control the touch sensing unit 151b to illuminate the region of the touch sensing unit 151b to which the user's touch is applied.
A consecutive touch input is applied to the touch sensing unit 151b (S502). Here, the touch inputs may preferably be applied to both touch sensing units 151b at the same time, but are not necessarily limited to this. The consecutive touch input is an input scheme in which a finger moves along a length direction of the display unit. The touch range of the consecutive touch input is defined by the location at which the initial touch is applied as well as the length over which the touch input is applied.
The controller controls the display unit 151 to display second screen information in one region of the display unit corresponding to the touch range of the consecutive touch input (S503).
According to the embodiment illustrated in FIGS. 5A(a) to 5A(d), the second screen information may correspond to an execution screen of another application. In other words, the controller controls the display unit 151 to switch part of the first screen information, which is the execution screen of the first application, to second screen information, which is an execution screen of a second application. Referring to FIGS. 5A(a) to 5A(d), the second application may correspond to a gallery application for displaying a plurality of images.
The display unit displays the second screen information 520 in a partial region of the display unit corresponding to the touch range, namely, the touch distance applied to the touch sensing unit 151b. In other words, the partial region corresponds to a region extending from the position of the initial touch applied to the touch sensing unit 151b to the final movement position. Touch inputs applied to both touch sensing units 151b may preferably move substantially in parallel to each other, but are not necessarily limited to this.
Furthermore, the display unit displays the second screen information 520 in a region corresponding to the touch range, and displays part of the first screen information 510 in the remaining region. The second screen information 520 corresponds to part of an execution screen of the second application.
When the touch range corresponds to part of the length of the display unit, the controller activates the first and the second application at the same time, and controls the display unit to display part of the execution screen of each of the first and the second application in each divided region of the display unit.
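To make the geometry concrete, the following Kotlin sketch models how a touch range on the lateral sensor might be mapped to a split of the display between the two execution screens. It is only an illustration of the behavior described above, not the disclosed implementation; the pixel coordinate system and all names are assumptions.

```kotlin
// Minimal sketch, assuming lateral-sensor coordinates run from 0 (top) to
// displayHeight (bottom) in pixels. All names are hypothetical.
data class Split(val secondScreenPx: Int, val firstScreenPx: Int)

// A downward drag from startY to endY reveals the second screen over the
// swept range; the first screen keeps the remaining region.
fun splitForTouchRange(startY: Int, endY: Int, displayHeight: Int): Split {
    val range = (endY - startY).coerceIn(0, displayHeight)
    return Split(secondScreenPx = range, firstScreenPx = displayHeight - range)
}

fun main() {
    // Dragging from the top edge to 40% of the height splits the screen 40/60.
    println(splitForTouchRange(startY = 0, endY = 768, displayHeight = 1920))
    // Output: Split(secondScreenPx=768, firstScreenPx=1152)
}
```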
Here, the second application corresponds to an application being executed along with the first application but in an inactive state. Furthermore, the second application corresponds to an application activated immediately prior to or immediately subsequent to activating the first application.
The touch sensing unit 151b senses a touch input applied in both directions, namely, along a length direction of the display unit or the opposite direction thereof. For example, when a touch input is consecutively applied from the top to the bottom, the controller may control the display unit to activate the application activated immediately prior to the currently activated application, and display an execution screen thereof.
Furthermore, the controller controls the display unit to display one region of each screen information along the direction in which the touch input is applied. For example, when the touch input moves in a downward direction based on the display unit, the display unit displays an upper region of the first screen information 510 at a lower end portion of the display unit, and a lower region of the second screen information 520 at an upper end portion of the display unit.
The controller controls the display unit to limit the output of part of the first screen information and display the second screen information in a larger region in real time as the touch range increases from the moment at which the initial touch is applied to the touch sensing unit 151b.
Referring to FIGS. 5A(b) and 5A(c), when the touch range substantially corresponds to the length of a lateral surface of the display unit, the controller controls the display unit to display the second screen information 520 as a whole on the display unit. For example, when the touch input is consecutively applied from the uppermost end to the lowermost end of a lateral surface of the display unit, the controller controls the display unit to switch from the first screen information 510 to the second screen information 520. Furthermore, the controller may control the display unit to increase the region displayed with the second screen information 520 while decreasing the region displayed with the first screen information 510.
Accordingly, the controller may switch the first application to an inactive state.
Referring to FIGS. 5A(c) and 5A(d), the touch sensing unit 151b senses a touch input applied in the opposite direction. In other words, when a touch input moving from the lowermost end of the touch sensing unit 151b to the uppermost end thereof is sensed by the touch sensing unit 151b, the controller controls the display unit to switch the second screen information 520 back to the first screen information 510.
In other words, the controller controls the display unit to display the execution screen of the previous application again based on the touch input applied in the opposite direction.
The controller may sequentially record the applications activated by the user, and sequentially display the execution screens of the applications based on the direction of a consecutive touch input applied to the touch sensing unit 151b. For example, it is assumed that a first through a third application are being executed, that the first through the third application have been sequentially activated, and that an execution screen of the second application is currently displayed on the display unit. The controller may control the display unit to display an execution screen of the first application based on a touch input applied to the touch sensing unit 151b in a downward direction, and display an execution screen of the third application based on a touch input applied to the touch sensing unit 151b in an upward direction.
Accordingly, the user may immediately return to an application that has been previously executed, without entering an additional control command to check information on the applications currently being executed.
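The activation-history behavior described above resembles a cursor moving over a recency-ordered list. The Kotlin sketch below models it under that assumption; the class and the direction convention (down = older, up = newer) are hypothetical names chosen for illustration only.

```kotlin
// Minimal sketch of navigating sequentially activated applications.
// history is ordered oldest-first; cursor points at the currently shown app.
class ActivationHistory(private val history: List<String>, private var cursor: Int) {
    // A downward lateral swipe shows the app activated immediately before.
    fun onSwipeDown(): String {
        if (cursor > 0) cursor--
        return history[cursor]
    }
    // An upward lateral swipe shows the app activated immediately after.
    fun onSwipeUp(): String {
        if (cursor < history.lastIndex) cursor++
        return history[cursor]
    }
}

fun main() {
    val nav = ActivationHistory(listOf("first app", "second app", "third app"), cursor = 1)
    println(nav.onSwipeDown()) // first app
    println(nav.onSwipeUp())   // second app
    println(nav.onSwipeUp())   // third app
}
```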
A control method of each application in a state in which the first and the second screen information are displayed will be described with reference to FIGS. 5B(a) to 5B(d). The controller displays the first screen information 510 in one region of the display unit based on a consecutive touch input applied to the touch sensing unit 151b, and displays the second screen information 520 in the remaining region.
Referring to FIGS. 5B(c) and 5B(d), the controller may control the second application based on a touch input applied to the second screen information 520. For example, the second application corresponds to a gallery application for displaying images stored in the memory 160, and the second screen information 520 includes a plurality of images in a reduced size.
In a case where a plurality of images are displayed, if a touch input is applied to one image, the controller controls the display unit to display only that image in an enlarged size.
According to the present embodiment, when a touch input is applied to one image contained in the second screen information 520, the controller controls the display unit to display at least part 520′ of the image in the remaining region of the display unit. In other words, the controller controls the second application based on a touch input applied to the second screen information 520, and the display unit displays the changed execution screen of the second application in the remaining region.
Though not shown in the drawing, the first application may be controlled by a touch input applied to the first screen information 510. In other words, the controller may activate the first and the second application at the same time while the first and the second screen information 510, 520 are displayed at the same time.
Accordingly, the user may activate the first and the second application at the same time, and receive the resultant execution screens at the same time.
A control method of providing an icon of a recommended application based on a touch input applied to the touch sensing unit 151b will be described with reference to FIGS. 5C(a) to 5C(c).
For example, the controller controls the display unit to display an execution screen of the application that was activated immediately before, based on a touch input applied to the touch sensing unit 151b in a downward direction, and to display an execution screen of the application that was activated immediately after (more recently), based on a touch input applied to the touch sensing unit 151b in an upward direction.
Referring to FIGS. 5C(a) and 5C(b), when a touch input is applied to the touch sensing unit 151b in an upward direction, the controller displays an execution screen of an application activated more recently than the current application. However, when the first application of the first screen information 510 is the most recently activated application, the controller provides information on a recommended application.
In other words, the controller controls the display unit to display recommended icons corresponding to at least one application at a lower end portion of the display unit. The recommended application may be previously set by the user, or recommended as a result of analyzing the first screen information 510.
For example, the recommended icons may include a first icon 531 previously set by the user. The first icon 531 may correspond to a graphic image for receiving a touch input to switch the screen to a home screen page.
The recommended icons may include a second icon 532 for an application expected to be used, obtained through the analysis of the first screen information 510, and a third icon 533 for an application associated with information contained in the first screen information 510. For example, when the first screen information 510 corresponding to an execution screen of a message application is displayed, the first screen information 510 may include text such as "Show me a photo" in the message. The controller may expect that the user will execute a gallery application to share images, and display a second icon 532 of the gallery application.
Furthermore, when a phone number is contained in the message, the controller may control the display unit to display a third icon 533 of the phone application.
Furthermore, the controller analyzes the execution trend of each application using the recorded execution history of the user. For example, the controller may recommend an application having a high frequency of being executed along with the message application. For example, when the user frequently executes an Internet connection application while activating the message application, the display unit may display a fourth icon 534 of the Internet connection application.
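The recommendation logic in this passage amounts to a small set of rules over the current screen's text and the usage history. The Kotlin sketch below illustrates one way such rules could be expressed; the keyword, the phone-number pattern, and the co-usage rule are all assumptions for illustration, not values from the disclosure.

```kotlin
// Minimal sketch of recommended-icon selection, assuming the screen text and a
// co-usage count per application are available. All rules shown are hypothetical.
fun recommendIcons(screenText: String, coUsageCounts: Map<String, Int>): List<String> {
    val icons = mutableListOf("home screen") // first icon: preset by the user
    // Second icon: an application expected to be used, inferred from the text.
    if (screenText.contains("photo", ignoreCase = true)) icons += "gallery"
    // Third icon: an application associated with information in the text.
    if (Regex("""\d{2,4}-\d{3,4}-\d{4}""").containsMatchIn(screenText)) icons += "phone"
    // Fourth icon: the application most frequently executed along with this one.
    coUsageCounts.maxByOrNull { it.value }?.let { if (it.value > 0) icons += it.key }
    return icons
}

fun main() {
    val text = "Show me a photo! Call me at 010-1234-5678."
    println(recommendIcons(text, mapOf("internet" to 7, "music" to 2)))
    // [home screen, gallery, phone, internet]
}
```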
The recommended icons are displayed to occupy a preset region of the display unit. Referring to FIG. 5C(b), the recommended icons are displayed to cover part of the first screen information 510.
On the other hand, referring to FIGS. 5C(a) and 5C(c), the display unit may move the first screen information 510 in an upward direction, limiting the display of an upper end portion of the first screen information 510, and display the recommended icons in the region connected to the lower end portion of the first screen information 510. In other words, the display unit may move the first screen information 510 in an upward direction and display the icons in the remaining region thereof.
However, the location at which the icons are displayed is not necessarily limited to this. Furthermore, when a touch input in a downward direction is applied to the touch sensing unit 151b in a case where the display unit currently displays an execution screen of the application that was executed first, the controller may control the display unit to display the recommended icons at an upper portion of the display unit.
Accordingly, even when there no longer exists an execution screen of an application to be provided, the user may execute a desired application in a more convenient manner using the recommended applications.
A control method of displaying two pieces of screen information at the same time will be described with reference to FIGS. 5D(a) to 5D(c). Referring to FIG. 5D(a), when the consecutive touch input is suspended in one region of the touch sensing unit 151b, the controller displays part of the first and the second screen information 510, 520 at the same time on the display unit. The boundary between the first and the second screen information 510, 520 corresponds to the location of the touch sensing unit 151b to which the user's touch input is applied.
Referring to FIGS. 5D(a) and 5D(b), when a touch input applied to the touch sensing unit 151b is released, the controller controls the display unit to display only one of the first and the second screen information 510, 520.
For example, when the touch input is released, the controller selects the screen information displayed in the larger region of the display unit. Accordingly, the controller controls such that the display of the first screen information 510 is limited, and the second screen information 520 is displayed as a whole on the display unit.
On the other hand, referring to FIGS. 5D(a) and 5D(c), when a preset type of touch input is applied to the touch sensing unit 151b, the controller may control the display unit to keep displaying the first and the second screen information 510, 520 at the same time.
For example, when a touch input held for a preset period of time (for example, a long touch input in a suspended state) is applied to and released from one region of the touch sensing unit 151b while the first and the second screen information 510, 520 are displayed at the same time on the display unit, the controller may control the display unit to continue displaying the first and the second screen information 510, 520 at the same time. The display region and boundary of the first and the second screen information 510, 520 may correspond to the region of the touch sensing unit to which the long touch input was applied.
Accordingly, the user may receive the desired screen information in a more convenient manner.
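One way to read the release behavior above is as a small decision rule: a long press preserves the split, and an ordinary release keeps whichever screen holds the larger region. The Kotlin sketch below encodes that reading; the threshold value, the assumption that the second screen occupies the region above the boundary, and all names are illustrative, not taken from the disclosure.

```kotlin
// Minimal sketch of the release rule, assuming the boundary position and the
// press duration are known at release time. Names and threshold are hypothetical.
sealed class Layout
data class SplitLayout(val boundaryY: Int) : Layout()  // both screens remain
data class FullScreen(val screen: String) : Layout()   // one screen remains

fun onRelease(boundaryY: Int, displayHeight: Int, pressDurationMs: Long): Layout {
    val longPressThresholdMs = 500L // assumed "preset period of time"
    return when {
        // Long touch in a suspended state: keep showing both screens.
        pressDurationMs >= longPressThresholdMs -> SplitLayout(boundaryY)
        // Otherwise keep the screen occupying the larger region, assuming the
        // second screen fills the area above the boundary.
        boundaryY > displayHeight / 2 -> FullScreen("second screen information")
        else -> FullScreen("first screen information")
    }
}
```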
FIGS. 6A(a) to 6A(d) and 6B(a) to 6B(c) are conceptual views for explaining a control method of displaying additional information on one content selected from a plurality of contents contained in screen information.
FIG. 6A(a) illustrates a display unit displaying a first execution screen 540 of a gallery application in which a plurality of images acquired through different angles of view or capture schemes are displayed at the same time. The controller senses the region of the touch sensing unit 151b to which the touch input is applied. The first execution screen 540 contains a plurality of contents, and the contents correspond to a plurality of images.
The controller selects at least one content displayed in one region of the display unit corresponding to the region of the touch sensing unit 151b on which the touch input is sensed. Here, the region of the display unit corresponding to the region of the touch sensing unit 151b is determined by two touch input positions applied in parallel to both touch sensing units 151b formed on both lateral surfaces. Defining a virtual line connecting the two touch input positions applied to both touch sensing units 151b, the virtual line passes through one region of the display unit based on a touch input consecutively applied along one direction. The controller selects the content displayed in the region of the display unit through which the virtual line passes. In other words, the controller selects the contents displayed in a row on the display unit corresponding to the touch range.
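As a rough model of this selection, the virtual line between the two lateral touch positions can be intersected with the rows of a content grid. The following Kotlin sketch is an illustrative approximation only; the uniform row geometry and the averaging of the two touch heights are assumptions.

```kotlin
// Minimal sketch of virtual-line row selection, assuming contents are laid out
// in uniform rows of a known height. All names are hypothetical.
data class Content(val id: String, val row: Int)

fun selectRow(leftTouchY: Int, rightTouchY: Int, rowHeightPx: Int, contents: List<Content>): List<Content> {
    // The virtual line connects the touch positions on the two lateral sensors;
    // with a roughly horizontal line, its height is near the average of the two.
    val lineY = (leftTouchY + rightTouchY) / 2
    val row = lineY / rowHeightPx
    return contents.filter { it.row == row }
}

fun main() {
    val grid = listOf(Content("img1", 0), Content("img2", 0), Content("img3", 1))
    // A line crossing the first row (y ≈ 150 with 300 px rows) selects img1 and img2.
    println(selectRow(leftTouchY = 140, rightTouchY = 160, rowHeightPx = 300, contents = grid))
}
```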
The controller controls the display unit to display firstadditional information550 associated with the selected content. The firstadditional information550 may be preferably displayed in a region adjacent to the selected content. Furthermore, the controller may control the display unit to display in such a manner that the firstadditional information550 covers one region of thefirst execution screen540. However, a region of the display unit displayed with the firstadditional information550 may not be necessarily limited to this. For example, the controller may control the display unit to move part of thefirst execution screen540 to another region to secure a region in which the firstadditional information550 is to be displayed.
The firstadditional information550 may correspond to a menu for controlling (or editing) the selected content. For example, the firstadditional information550 may correspond to a menu image for receiving a touch input to hide, edit or arrange the contents.
The controller may select a plurality of contents displayed in one row since the plurality of contents are selected by a touch input in a row unit. In this case, the controller may select only one content from the plurality of contents using a touch input applied to the display unit. For example, as illustrated in the drawing, three images are selected based on a touch input applied to thetouch sensing unit151b, and the display unit displays a menu image for the three images. Consequently, the controller selects one image based on a touch input applied to the display unit, and controls the selected one image based on an additional touch input applied to the menu image. As illustrated in the drawing, the controller controls the display unit to cover or (limit the display of) the selected onecontent541 based on a user's plurality of touch inputs.
The controller may control the display unit to limit the display of the firstadditional information550 based on a touch input applied along a direction opposite to thetouch sensing unit151b. In other words, the controller controls the display unit to select content using a touch input applied along the one direction and display the firstadditional information550 according to the selected content, and controls the display unit to allow the displayed firstadditional information550 to disappear by a touch input applied along an opposite direction.
For example, a consecutive touch input applied along the opposite direction may be defined as a touch input applied in such a manner that the virtual line passes through the firstadditional information550.
Referring toFIGS. 6A(c) and6A(d), the controller may control the display unit limit the display of the firstadditional information550 and the selected onecontent541 based on a touch input applied to thetouch sensing unit151b.
Furthermore, though not shown in the drawing, all contents contained in a row through which the virtual line passes may be selected based on the touch input. For example, when text formed with a plurality of rows is contained in the execution screen, the controller may select the plurality of rows, and perform editing on the text.
Accordingly, the user may select at least one content displayed on the display unit using a touch input to one region of the touch sensing unit, and more easily receive additional information thereon.
A control method of providing distinguished additional information on selected content will be described with reference to FIGS. 6B(a) to 6B(c).
FIG. 6B(a) is a view illustrating a display unit displaying an execution screen 560 corresponding to a web browser screen. The execution screen 560 includes different forms of contents. For example, the execution screen 560 may include an image, an input window, text, video, an image connected to a hyperlink, an icon, and the like. For example, the execution screen 560 may include first content 561 containing an image and text, and second content 562 with a numeric form.
Referring to FIGS. 6B(a) and 6B(b), the controller selects the first content 561 based on a touch input applied to the touch sensing unit 151b. The method of selecting the content is substantially the same as the method of FIGS. 6A(a) to 6A(d), and thus the redundant description thereof will be omitted.
The controller may control the display unit to select the first content 561 based on the touch input, and display second additional information 571 in a region adjacent to the execution screen 560. The second additional information 571 corresponds to information associated with the first content 561. For example, the second additional information 571 may correspond to a menu image configured to receive a touch input for storing the image, storing the text, or copying the image and text.
On the other hand, referring to FIGS. 6B(a) and 6B(c), the controller selects the second content 562 based on the touch input. The controller controls the display unit to display third additional information 572 associated with the second content 562 in a region adjacent to the second content 562.
For example, the third additional information 572 may include a menu image for placing a call using the number, storing the numeral as a phone number, or receiving a touch input to transmit text. The controller activates an application based on a touch input applied to one region of the menu image, and performs the selected function using information contained in the content.
Accordingly, the user may select one of a plurality of contents, and receive additional information associated with the selected content.
FIGS. 7A(a) to7C(c) are conceptual views for explaining a control method of displaying lower information of the selected content.
FIG. 7A(a) is a view illustrating a display unit for displaying thefirst execution screen540 containing a plurality of image groups. Though not shown in the drawing, the controller may display at least one lower image contained in the image group on the display unit based on a touch input applied to the image group.
Referring toFIGS. 7A(a) and7A(b), the controller selects at least one image group displayed in a row of the image group contained in thefirst execution screen540 based on a touch input applied to thetouch sensing unit151b. The controller may control the display unit to display at least one image contained in the selected image group based on the touch input consecutively applied to thetouch sensing unit151b.
In other words, the controller may preferably control the display unit to display firstlower information551 of the selected image group, and display the firstlower information551 in a region adjacent to the selectedimage group541.
Furthermore, the controller may control the display unit to display the firstlower information551 in one region of the display unit through which a line defined by a consecutive touch input applied to thetouch sensing unit151bpasses. Furthermore, the display unit may display a larger number of the firstlower information551 as increasing a region of the applied touch input.
For example, when the image group selected by thetouch sensing unit151bcontains a plurality of images, and the display unit displays a plurality of images in a preset size, the controller controls the display unit to display the larger amount of images as increasing the touch range.
In other words, the user may check a plurality of images contained in the image group without applying a touch input to an image group displayed on the display unit. Furthermore, when a plurality of image groups are selected, the user may receive the distinct images contained in each image group at the same time.
Referring to FIGS. 7A(a) and 7A(c), the controller controls the display unit to display second lower information 552 of the selected image group. For example, the second lower information 552 may correspond to the generation information, setting information, size or the like of the image group.
A control method of providing third lower information contained in an application without activating the application on a home screen page will be described with reference to FIGS. 7B(a) and 7B(b).
FIG. 7B(a) is a view illustrating a display unit displaying third screen information 580 corresponding to a home screen page. The third screen information 580 may include at least one icon for receiving a touch input to execute at least one application.
The display unit may further display a notification badge on an icon when an event occurs or is received at the corresponding application. For example, when a message is received at the mobile terminal, the display unit may further display a notification badge on the icon 581 of the message application.
The controller may select at least one of the plurality of icons based on a touch input applied to the touch sensing unit 151b while the home screen page is displayed.
The controller may control the display unit to display additional information on an application corresponding to the selected icon. For example, the controller selects at least one icon displayed in one region of the display unit corresponding to the touch range of the touch sensing unit 151b. The controller may display third lower information 590 on the application of an icon on which the notification badge is formed among the selected icons.
For example, when the icon 581 of the message application is selected by the touch range, the controller may control the display unit to display the received message.
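The following illustrative sketch models selecting the icons that fall within the touch range and surfacing the lower information of those carrying a notification badge; the icon model and row coordinates are hypothetical.

```kotlin
// Hypothetical home-screen model: icons occupy rows measured in pixels, and an
// icon with a pending event corresponds to one carrying a notification badge.
data class Icon(
    val app: String,
    val rowTopPx: Int,
    val rowBottomPx: Int,
    val pendingEvent: String? = null
)

// Selects icons whose rows intersect the touch range of the lateral touch
// sensing unit, then returns the lower information of badge-bearing icons.
fun lowerInfoForRange(icons: List<Icon>, rangeStartPx: Int, rangeEndPx: Int): List<String> =
    icons.filter { it.rowBottomPx >= rangeStartPx && it.rowTopPx <= rangeEndPx }
         .mapNotNull { it.pendingEvent }

fun main() {
    val icons = listOf(
        Icon("gallery", 0, 100),
        Icon("message", 100, 200, pendingEvent = "New message: lunch at noon?")
    )
    println(lowerInfoForRange(icons, 90, 210)) // [New message: lunch at noon?]
}
```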
However, the present disclosure may not be necessarily limited to this, and the controller may control the display unit to display the additional information of an application corresponding to the selected icon even when there is no event or notification corresponding to the selected icon. Here, the additional information may correspond to the description of the application, information associated with its settings, used memory, or the like.
Furthermore, the display unit may display the third lower information 590 in one region of the display unit corresponding to the touch range.
Accordingly, the user may receive information on an event on a home screen page without activating an application containing the received event.
Though not shown in the drawing, the controller may control the display unit to limit the display of the third lower information 590 when a touch input in an opposite direction is applied to the touch sensing unit 151b, or a touch input is applied to one region of the display unit (a region in which the third lower information 590 is not displayed).
Furthermore, when a touch input is applied to the third lower information 590, the controller may control the display unit to activate the application and display an execution screen of the application.
A notification image displayed on the touch sensing unit 151b will be described with reference to FIGS. 7C(a) to 7C(c). According to the present embodiment, the touch sensing unit 151b may be implemented as a display device for displaying visual information. Accordingly, if necessary, the controller may control the touch sensing unit 151b to display an image. The present embodiment may not be necessarily limited to the control method illustrated in the drawing, and may be applicable to all the foregoing embodiments.
Referring to FIGS. 7C(a) to 7C(c), when displayable lower information is contained in content, the controller controls such that the notification image 600 is displayed in one region of the touch sensing unit 151b. The notification image 600 may be formed as a preset image or a form of emitting light. The notification image 600 may preferably be formed in a region adjacent to content containing additional information or lower information.
For example, when one image 542 of the images displayed on the display unit contains lower information 553 in a state that a gallery application is activated, the controller controls the touch sensing unit 151b to display a notification image 600 in the row in which the image 542 is displayed.
Referring to FIGS. 7C(a) and 7C(c), the controller controls the display unit to display the image 542 in an enlarged manner and display the lower information 553 at the same time based on a touch input applied to the image 542.
Referring to FIGS. 7C(a) and 7C(b), when the touch range of the touch sensing unit 151b contains a display region of the notification image 600, the controller controls the display unit to display the lower information 553.
Furthermore, the controller may control the touch sensing unit 151b to move the display location of the notification image 600 based on a consecutive touch input applied to the touch sensing unit 151b. However, the present disclosure may not be necessarily limited to this, and the touch sensing unit 151b may modify the shape of the notification image 600 or fix the location of the notification image 600 based on the touch input.
Though not shown in the drawing, while an execution screen of one application is displayed, when an application executed prior to or subsequent to the execution of the application exists, the notification image 600 may be displayed at an upper portion or lower portion of the touch sensing unit 151b.
Accordingly, the user may apply a touch input to the touch sensing unit 151b to check in advance whether or not there is receivable information, or content associated with receivable information.
FIGS. 8A and 8C are conceptual views for explaining a control method according to the touch scheme of touch inputs applied to touch sensing units at both sides thereof. FIGS. 8A and 8C illustrate a display unit displaying the first screen information of a first application.
The controller may control the display unit based on a consecutive touch input applied to the touch sensing unit 151b on one lateral surface thereof in a state that touch inputs are applied to the touch sensing units on both lateral surfaces thereof at the same time.
Referring to FIGS. 8A and 8C, when a long touch input is applied to one region of the left touch sensing unit, and a consecutive touch input moving along one direction is applied to the right touch sensing unit, the second screen information 520 of the second application is displayed along with the first screen information 510.
Here, the region of the display unit on which the second screen information 520 is displayed is based on the touch range of the touch input applied to the right touch sensing unit 151b. In other words, another application may be activated not only when consecutive touch inputs are applied to the left and the right touch sensing unit at the same time, but also when a consecutive touch input is applied to only one side thereof.
On the other hand, referring to FIGS. 8A and 8C, when a long touch input is applied to the left touch sensing unit 151b, and a consecutive touch input is applied to the right touch sensing unit, the controller activates a second application.
However, the controller controls the display unit to display the first and the second screen information 510, 520 using the one region of the left touch sensing unit 151b to which the long touch input is applied as a boundary. According to the present embodiment, the display of the screen information is not controlled by the touch range of a consecutive touch input applied to the right touch sensing unit 151b.
Accordingly, even if consecutive touch inputs are not applied to the touch sensing units 151b at both sides thereof, it may be possible to control the activation of an application and the display of screen information in a more convenient manner.
Hereinafter, a method of performing various functions using both lateral surfaces of the display unit will be described in more detail with reference to the accompanying drawings. FIG. 9 is a flow chart illustrating a method of controlling both lateral surfaces of the display unit according to the present disclosure, and FIGS. 10A(a) to 10D(b) are conceptual views illustrating the control method of FIG. 9.
A mobile terminal according to the present disclosure may perform the process of displaying screen information in a first region disposed on a front surface of the body (S910).
A mobile terminal according to the present disclosure may further include a display unit 251 on which a first region 261 is disposed on a front surface of the body, and a second region 262, 263 connected to the first region and disposed on a lateral surface of the body.
Screen information associated with a function that can be driven in the mobile terminal may be displayed on the display unit 251. For example, the screen information may be the execution screen information of an application installed in the mobile terminal. Furthermore, a graphic object such as an icon, a widget or the like indicating an application installed in the mobile terminal may be displayed on the display unit 251.
On the other hand, screen information associated with a function that can be driven in the mobile terminal may be displayed in the second region as well as in the first region 261.
A mobile terminal according to the present disclosure may sense a preset type of touch being applied to a second region disposed on a lateral surface of the body (S920).
The controller 180 may control the first and the second region in an interlocking manner. For example, the controller 180 may control screen information displayed in the first region using a touch applied to the second region. Here, controlling screen information may be understood as performing a function associated with the screen information. For example, the controller 180 may perform the function of scrolling a web page according to a drag input applied to the second region 262 in a state that screen information indicating the web page is displayed in the first region.
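A hedged sketch of this interlocking control, in which a drag sensed in the second (lateral) region scrolls content shown in the first (front) region; the class and method names are illustrative stand-ins for the controller 180's behavior, not its real interface.

```kotlin
// Hypothetical model of the front-region content being scrolled.
class WebPageView(var scrollY: Int = 0, val maxScrollY: Int = 5000) {
    fun scrollBy(dy: Int) { scrollY = (scrollY + dy).coerceIn(0, maxScrollY) }
}

// Interlocking controller: touches in the second region drive the first region.
class InterlockedController(private val firstRegionContent: WebPageView) {
    // Called with the vertical displacement of a drag sensed in the second region.
    fun onSecondRegionDrag(deltaYPx: Int) = firstRegionContent.scrollBy(deltaYPx)
}

fun main() {
    val page = WebPageView()
    val controller = InterlockedController(page)
    controller.onSecondRegionDrag(240)
    println(page.scrollY) // 240: the web page scrolled by the lateral drag
}
```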
To this end, a mobile terminal according to the present disclosure may further include a sensing unit 140 capable of sensing a touch applied to the first and the second region. The sensing unit 140 may be a touch sensor configured to sense a touch applied to the display unit 251.
On the other hand, the display unit disposed on a lateral surface of the body may include a second region 262 disposed on a left lateral surface thereof and a third region 263 disposed on a right lateral surface thereof.
The controller 180 may sense a touch applied to at least part of the second and the third region using the sensing unit 140. Here, the controller 180 may determine whether or not a preset type of touch has been sensed in the second and the third region. The preset type of touch may use various touch input schemes such as a drag input, a multi touch or the like applied to at least part of the second and the third region. For example, the controller 180 may sense drag inputs applied to the second and the third region at the same time. Here, the drag inputs applied to the second and the third region may have the same direction or different directions.
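The following sketch illustrates one way such simultaneous drags could be detected and their directions compared; the movement threshold is an assumed value, not one given in the disclosure.

```kotlin
// Drag direction classifier with an assumed dead-zone threshold.
enum class DragDirection { UP, DOWN, NONE }

fun direction(deltaY: Int, thresholdPx: Int = 20): DragDirection = when {
    deltaY <= -thresholdPx -> DragDirection.UP
    deltaY >= thresholdPx -> DragDirection.DOWN
    else -> DragDirection.NONE
}

// True when drags are sensed in both the second and the third region at the
// same time, whether their directions match or differ.
fun isSimultaneousDrag(secondDeltaY: Int, thirdDeltaY: Int): Boolean =
    direction(secondDeltaY) != DragDirection.NONE &&
    direction(thirdDeltaY) != DragDirection.NONE

fun main() {
    println(isSimultaneousDrag(120, 130))  // true: same direction
    println(isSimultaneousDrag(120, -90))  // true: opposite directions
    println(isSimultaneousDrag(120, 5))    // false: only one region dragged
}
```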
When a preset type of touch is sensed on at least part of the second and the third region, a mobile terminal according to the present disclosure may perform the process of displaying information associated with screen information displayed in the first region in a direction different from the displayed direction of the screen information (S930).
A mobile terminal according to the present disclosure may display screen information having at least two display directions in the first region. In other words, the user may view at least two pieces of screen information having different directions at the same time.
To this end, in a state that screen information having a first display direction is displayed in the first region, the controller 180 may display screen information having a second display direction which is different from the first display direction in at least part of the first region.
Here, the display direction of the screen information may denote a direction in which the screen information is displayed on the display unit 251. More specifically, the display direction of the screen information may be determined according to the posture of the mobile terminal body. For example, when the display direction of the screen information displayed in a posture in which a front surface of the display unit 251 is placed in a horizontal direction is defined as a first display direction, the display direction of the screen information displayed in a posture in which a front surface of the display unit 251 is placed in a vertical direction may be defined as a second display direction. In other words, the second display direction may differ from the first display direction by 90 degrees.
According to the present disclosure, the screen information having different directions may denote screen information displayed according to the posture of the body. For example, as illustrated in FIG. 10A(c), screen information 1010 having a first display direction may denote screen information displayed when the body is placed in a horizontal direction, and screen information 1020 having a second display direction may be understood as screen information displayed when the body is placed in a vertical direction.
Furthermore, the display direction of the screen information may be determined by a preset type of touch. For example, the display direction of the screen information may be determined according to the direction of a drag input applied to the second region.
Here, in a state that screen information having the first display direction is displayed, the second display direction may be determined by a direction in which a preset touch is applied. For example, the controller 180 may determine a direction corresponding to the direction of a drag input applied to the second region as the second display direction.
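As a minimal sketch, the drag direction could select between two 90-degree rotations relative to the first display direction; the angle convention and names below are assumptions for illustration.

```kotlin
// Assumed convention: the first display direction is 0 degrees, and a drag
// along the second region picks one of two 90-degree rotations.
enum class SideDrag { TOP_TO_BOTTOM, BOTTOM_TO_TOP }

fun secondDisplayDirectionDeg(drag: SideDrag): Int = when (drag) {
    SideDrag.TOP_TO_BOTTOM -> 90
    SideDrag.BOTTOM_TO_TOP -> -90
}

fun main() {
    println(secondDisplayDirectionDeg(SideDrag.TOP_TO_BOTTOM)) // 90
}
```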
Here, screen information having the first display direction and screen information having the second display direction may be screen information associated with each other. For example, when screen information having the first display direction is screen information indicating the outgoing and incoming record of SMS text messages, screen information having the second display direction may be screen information indicating a list of recipients to which an SMS text message can be transmitted.
For another example, screen information having the second display direction may be screen information to be displayed in the first region when screen information having the first display direction is displayed in the second display direction. More specifically, screen information displayed in the first region may be displayed in any one of the first display direction and the second display direction. In this case, in a state that screen information having the first display direction is displayed in the first region, the controller 180 may display the same information in the second display direction at the same time.
Furthermore, screen information having the first display direction and screen information having the second display direction may be screen information unrelated to each other. For example, when screen information having the first display direction is screen information associated with the execution of a first application installed in the mobile terminal, screen information having the second display direction may be screen information associated with the execution of a second application which is different from the first application. Here, the second application may be any one of a plurality of applications being currently executed in the mobile terminal which is different from the first application.
On the other hand, according to the present disclosure, in addition to the first and the second display direction, screen information having various display directions such as a third and a fourth display direction may be displayed at the same time. Hereinafter, a case where two pieces of screen information having two directions are displayed at the same time will be described, but the description may be also applicable to a case where a plurality of pieces of screen information having a plurality of display directions are displayed.
In a state that screen information having the first display direction is displayed, the controller 180 may display screen information having the second display direction in at least part of the screen information having the first display direction.
More specifically, the controller 180 may display screen information having the second display direction in a region corresponding to a region in which a preset type of touch is sensed. Here, screen information having the first display direction may maintain its display state as it is. In this case, screen information having the second display direction may be displayed to be overlapped with screen information having the first display direction.
Here, a region displayed with screen information having the second display direction may be determined based on a region in which the preset type of touch is sensed in the first region. For example, when the preset type of touch is a drag input applied to the second and the third region 262, 263 at the same time, screen information having the second display direction may be displayed in a region having a length corresponding to a length over which the drag input is sensed.
Furthermore, when a preset type of touch is sensed, the controller 180 may switch screen information having the first display direction into screen information having the second display direction. In this case, the user may switch the display direction of the screen information with no posture change of the display unit 251. Here, the posture of the display unit 251 may be either one of a posture in which the display unit 251 is placed in a vertical direction and a posture in which the display unit 251 is placed in a horizontal direction.
When screen information having the first and the second display direction are screen information relevant to each other in a state that the screen information having the first and the second display direction are displayed in the first region 261 at the same time, an indicator indicating the relevance may be displayed on at least one of the screen information having the first and the second display direction.
More specifically, when at least part of the screen information having the first display direction and the screen information having the second display direction are the same, the controller 180 may display the same portion to be visually distinguished from the remaining screen information.
For example, when screen information having the first display direction is screen information indicating map information and screen information having the second display direction is screen information indicating more extended map information containing the map information, the controller 180 may process a region of the screen information having the second display direction indicating the same information as the screen information having the first display direction in a highlighted manner to visually distinguish it from the remaining screen information.
Furthermore, when screen information having the first display direction and screen information having the second display direction are screen information associated with each other, the controller 180 may control the other screen information in response to a touch manipulation with respect to either one of the screen information having the first and the second direction. Here, the control of the screen information may denote performing various functions associated with screen information such as scrolling screen information, switching screen information or the like.
Hereinafter, displaying the screen information having different display directions in the first region will be described in more detail with reference to the accompanying drawings.
As illustrated in FIG. 10A(a), a mobile terminal according to the present disclosure may include a display unit 251 having a first region 261 and a second region 262, 263. Here, as illustrated in FIG. 10A(a), an execution screen associated with an application being executed in the mobile terminal may be displayed in the first region 261. For example, the execution screen may be screen information 1010 associated with map information. Here, the screen information 1010 containing the map information may have a first display direction.
Here, the controller 180 may sense a touch applied to the second region 262, 263. More specifically, when the second region is divided into a first sub-region 262 and a second sub-region 263, the controller may sense a touch applied to at least one of the first and the second sub-region 262, 263. For example, as illustrated in FIG. 10A(b), the controller 180 may sense a drag input applied to the first and the second sub-region 262, 263 at the same time.
When the sensed touch is a preset type of touch, the controller 180 may display information having a second display direction which is different from the first display direction in at least part of the first region. For example, as illustrated in FIG. 10A(c), the controller 180 may display, in at least part of the first region, the screen information in the form it would take when displayed in a second display direction which is different from the first display direction. More specifically, when screen information having the first display direction is screen information on which the map information is displayed in a horizontal direction based on the front surface of the display unit 251, screen information having the second display direction may be screen information 1020 on which the map information is displayed in a vertical direction based on the front surface of the display unit 251.
Through this, the user may receive screen information having various display directions with no posture change or additional manipulation of the mobile terminal body.
On the other hand, the controller 180 may determine a region displayed with screen information having the second display direction based on a region in which a preset touch is sensed. For example, as illustrated in FIG. 10B(b), the controller 180 may sense a drag input in the second region 262, 263. Here, the controller 180 may determine a region within the first region 261 in which screen information having the second display direction is displayed based on a length over which the drag input is sensed in the second region 262, 263.
For example, as illustrated in FIG. 10B(c), the controller 180 may display screen information having the second display direction in a region within the first region 261 having a length corresponding to the length over which the drag input is sensed in the second region 262, 263.
Through this, the user may adjust the length of a drag input applied to the second region to determine the size of a region in which screen information having a different display direction is to be displayed.
Furthermore, the controller 180 may display screen information 1020 having the second display direction in a region corresponding to a region to which a drag input is applied. More specifically, the controller 180 may detect the start position and end position of a drag input applied to the second region 262, 263. For example, as illustrated in FIG. 10C(b), the controller 180 may detect a region from a position corresponding to the start position of the drag input to a position corresponding to the end position thereof within the first region.
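A brief sketch of mapping the drag's start and end positions in the second region 262, 263 to the band of the first region where the second-direction screen is overlaid; the one-to-one pixel mapping between the lateral region and the first region is an assumption.

```kotlin
// Band of the first region to be overlaid with second-direction screen info.
data class Band(val topPx: Int, val bottomPx: Int)

// Maps the drag's start and end coordinates (assumed 1:1 with first-region
// pixels) to the overlay band, clamped to the first region's height.
fun overlayBand(dragStartY: Int, dragEndY: Int, firstRegionHeightPx: Int): Band {
    val top = minOf(dragStartY, dragEndY).coerceIn(0, firstRegionHeightPx)
    val bottom = maxOf(dragStartY, dragEndY).coerceIn(0, firstRegionHeightPx)
    return Band(top, bottom)
}

fun main() {
    println(overlayBand(dragStartY = 820, dragEndY = 380, firstRegionHeightPx = 1920))
    // Band(topPx=380, bottomPx=820): the second-direction screen fills this band
}
```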
Here, as illustrated in FIG. 10C(c), the controller 180 may display screen information 1020 having the second display direction in the detected region.
Through this, the user may determine a region in which screen information having the second display direction is to be displayed.
Furthermore, the controller 180 may no longer display screen information having the second display direction in response to a preset touch being sensed in a state that screen information having the first and the second display direction are displayed at the same time in the first region 261.
For example, as illustrated in FIG. 10D(a), when a drag input is sensed in a preset direction, the controller 180 may allow screen information 1020 having the second display direction to disappear from the first region. Here, only the screen information 1010 having the first display direction may be displayed in the first region 261.
In the above, a method of displaying screen information having a first and a second display direction at the same time has been described. Through this, the user may view screen information displayed in a plurality of directions with no additional manipulation such as a posture change of the display unit.
Hereinafter, a method of displaying a relationship between screen information associated with each other when the screen information having the first and the second display direction are associated with each other will be described. FIGS. 11A(a) to 11C(c) are conceptual views for explaining a relationship between screen information associated with each other.
A mobile terminal according to the present disclosure may display screen information having a first and a second display direction at the same time. Here, the screen information having the first and the second display direction may be screen information associated with each other. For example, when the first screen information is screen information indicating part of a document, the second screen information may be screen information indicating another part of the document containing the document content of the first screen information. In other words, the screen information having the first and the second display direction may be screen information on which parts of the same document are displayed in different display directions.
On the other hand, when screen information having the first and the second display direction are associated with each other, the controller 180 may visually display the relationship between the relevant screen information. More specifically, the controller 180 may display the screen information having the first and the second display direction in such a manner that a region displayed with the same information is distinguished from another region.
For example, as illustrated in FIG. 11A(a), when a preset type of touch is sensed in the second region 262, 263 in a state that screen information 1110 having a first display direction is displayed, the controller 180 may display screen information 1120 having a second display direction which is different from the first display direction in at least part of the first region.
Here, the controller 180 may detect a region in which the same information is displayed in the screen information having the first display direction and the screen information having the second display direction. For example, when screen information having the first display direction is information indicating part of a document, the controller 180 may detect a region of the screen information having the second display direction in which the same information as the screen information having the first display direction is displayed.
When a region displayed with the same information is detected, the controller 180 may display the region displayed with the same information to be visually distinguished from another region in at least one of the screen information having the first and the second display direction.
Here, the controller 180 may process either one of the screen information having the first and the second display direction in a visually distinguished manner, or process both of the screen information having the first and the second display direction in a visually distinguished manner.
Furthermore, the controller 180 may select the screen information to be processed in a visually distinguished manner based on a preset condition. Here, the preset condition may be which of the screen information having the first and the second display direction contains a larger amount of screen information. In other words, containing a larger amount of screen information may denote that either one thereof contains all the screen information displayed in the other one thereof.
For example, as illustrated in FIG. 11A(b), in a state that the screen information 1120 having the second display direction is overlapped with the screen information 1110 having the first display direction, the controller 180 may process a region 1130 displayed with the same information as the screen information 1110 having the first display direction within the screen information 1120 having the second display direction in a highlighted manner. The highlight processing may be expressed using the contrast, color, graphic object or the like of the region.
Furthermore, the controller 180 may process a region displayed with the same information as the screen information having the second display direction within the screen information having the first display direction in a highlighted manner. For example, as illustrated in FIG. 11B(b), when screen information having a first display direction is displayed on screen information having a second display direction, the controller 180 may process a region 1130 displayed with the same information within the screen information 1120 having the second display direction in a highlighted manner.
In other words, the controller 180 may detect, between the screen information having the first and the second display direction, the screen information which contains a larger amount of screen information while containing a region displayed with the same information, and process the region displayed with the same information within the detected screen information in a highlighted manner.
Furthermore, the controller 180 may control one of the screen information having the first and the second display direction using a touch applied to the other one. For example, as illustrated in FIG. 11C(b), a region 1130 currently corresponding to the screen information having the first display direction may be processed in a highlighted manner within the screen information having the second display direction. Here, the controller 180 may sense a touch being applied to any one region within a region displayed with the screen information 1120 having the second display direction.
When the touch is sensed, as illustrated in FIG. 11C(c), the controller 180 may scroll the screen information having the first display direction such that the information displayed at the position at which the touch is sensed is displayed on the screen information having the first display direction. In this case, the region 1130 corresponding to the region in which the touch is sensed may be processed in a highlighted manner.
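The following hedged sketch models this linked control: a touch on the second-direction (overview) screen scrolls the first-direction (detail) screen, and the highlighted region 1130 tracks what the detail screen now shows. The linear overview-to-document mapping and all names are assumptions.

```kotlin
// Hypothetical linked views: a compact overview (second display direction)
// drives a scrollable detail view (first display direction).
class LinkedViews(private val documentLength: Int, private val overviewLength: Int) {
    var detailOffset = 0   // scroll offset of the first-direction (detail) view
        private set
    var highlightStart = 0 // start of the highlighted region 1130 in the overview
        private set

    // Called when a touch is sensed at touchPos within the overview screen.
    fun onOverviewTouch(touchPos: Int, viewportLength: Int) {
        // Map the overview touch position to a document offset and scroll there.
        detailOffset = (touchPos * documentLength / overviewLength)
            .coerceIn(0, documentLength - viewportLength)
        // Re-highlight the overview region now shown by the detail view.
        highlightStart = detailOffset * overviewLength / documentLength
    }
}

fun main() {
    val views = LinkedViews(documentLength = 10000, overviewLength = 1000)
    views.onOverviewTouch(touchPos = 400, viewportLength = 1500)
    println(views.detailOffset)   // 4000
    println(views.highlightStart) // 400
}
```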
In the above, a method of controlling one of the screen information having the first and the second display direction using a touch applied to the other one, in a state that the screen information having the first and the second display direction is displayed in the first region, has been described. Through this, the user may use screen information having a first and a second display direction in an organic manner.
Hereinafter, a method of displaying screen information having a second display direction in a state that screen information having a first display direction is displayed will be described. FIGS. 12A(a) to 12B(b) are conceptual views for explaining an embodiment of displaying screen information having a second display direction.
The controller 180 may display screen information having a display direction which is different from the display direction of the screen information displayed in the first region in at least part of the screen information displayed in the first region, based on a preset touch being applied thereto.
Here, the preset type of touch, as a touch applied to the display unit 251, may be determined by at least one of a region in which the touch is applied, a time for which the touch is applied, a method with which the touch is applied, an area over which the touch is applied, and a length over which the touch is applied.
For example, the preset type of touch may be drag inputs applied to the second region 262, 263 at the same time. Furthermore, the preset type of touch may be also determined by the direction of the drag input.
For another example, as illustrated in FIG. 12A(a), when the second region contains a first sub-region 262 and a second sub-region 263, the preset type of touch may be a touch applied to either one of the first and the second sub-region 262, 263 and sensed over a region having an area larger than a preset area.
For still another example, as illustrated in FIG. 12B(a), the preset type of touch may be touches applied to the first and the second sub-region at the same time. Here, the touches applied to the first and the second sub-region may have the same direction or different directions. For example, as illustrated in FIG. 12B(a), the touches applied to the first and the second sub-region may be applied in different directions.
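As an illustration, a classifier over some of the listed criteria might look as follows; only the region, area, length and direction criteria are exercised, and every threshold is an assumed value.

```kotlin
// Hypothetical record of a touch sensed on one lateral sub-region:
// 'L' denotes the first sub-region 262, 'R' the second sub-region 263.
data class SideTouch(val region: Char, val areaPx2: Int, val lengthPx: Int, val deltaY: Int)

sealed interface PresetTouch
data class WideTouch(val region: Char) : PresetTouch          // large-area touch on one sub-region
data class DualDrag(val sameDirection: Boolean) : PresetTouch // drags on both sub-regions at once
object NoPreset : PresetTouch

fun classify(touches: List<SideTouch>): PresetTouch {
    // Area criterion: a touch wider than an assumed preset area.
    val wide = touches.firstOrNull { it.areaPx2 > 4000 }
    if (wide != null) return WideTouch(wide.region)
    // Length and direction criteria: simultaneous drags on both sub-regions.
    val second = touches.firstOrNull { it.region == 'L' && it.lengthPx > 30 }
    val third = touches.firstOrNull { it.region == 'R' && it.lengthPx > 30 }
    if (second != null && third != null)
        return DualDrag(sameDirection = (second.deltaY > 0) == (third.deltaY > 0))
    return NoPreset
}

fun main() {
    println(classify(listOf(
        SideTouch('L', 900, 120, +120),
        SideTouch('R', 900, 110, -110)
    ))) // DualDrag(sameDirection=false)
}
```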
As illustrated in FIGS. 12A(b) and 12B(b), when the preset type of touch is sensed, the controller 180 may display the screen information having different directions in at least part of the first region 261.
In the above example, screen information having different display directions has been displayed by a touch applied to the second region 262, 263. Through this, the user may view at least two pieces of screen information having at least two directions at once with a simple manipulation. On the other hand, the present disclosure may not be necessarily limited to this, and may display screen information having different directions by various control commands. For example, various schemes may be used such as a preset type of touch applied to the first region, a button input, a voice input, and the like.
Hereinafter, a method of determining at least one of the display locations of the screen information having the first and the second display direction in a state that the screen information having the first and the second display direction is displayed in the first region will be described. FIGS. 13A(a) to 13B(b) are conceptual views illustrating a method of determining at least one of the display locations of screen information having the first and the second display direction.
The controller 180 may change at least one of the display locations of the screen information having the first and the second display direction based on a preset type of touch applied to the second region 262, 263 in a state that the screen information having the first display direction and the screen information having the second display direction are displayed in the first region at the same time.
For example, as illustrated in FIG. 13A(a), the controller 180 may sense touches applied to the second region 262, 263 at the same time in different directions. In this case, as illustrated in FIG. 13A(b), the controller 180 may interchange a location at which the screen information 1010 having the first display direction is displayed and a location at which the screen information 1020 having the second display direction is displayed in response to the touches. In other words, the user may view the screen information 1010 having the first display direction and the screen information 1020 having the second display direction with their display locations changed.
For another example, as illustrated in FIG. 13B(a), the controller 180 may change the display location of the screen information 1020 having the second display direction while not changing the display location of the screen information 1010 having the first display direction.
Here, as illustrated in FIG. 13B(b), the controller 180 may rotate and display the screen information 1020 having the second display direction according to the extent of drag inputs applied to the second region 262, 263 in different directions. In this case, the screen information 1020 having the second display direction may be displayed to be overlapped with the screen information 1010 having the first display direction.
In the above, a method of changing at least one of the display locations of the screen information having the first and the second display direction when a preset type of touch is applied to the second region in a state that the screen information having the first and the second display direction are displayed has been described. Through this, the user may receive screen information having different display directions at various display locations.
Hereinafter, when the screen information having the second display direction is displayed based on a preset type of touch being applied to the second region in a state that the screen information having the first display direction is displayed in the first region, the type of screen information having the second display direction will be described in more detail with reference to the accompanying drawings. FIGS. 14A(a) to 14D(b) are conceptual views illustrating types of screen information having a second display direction.
When a preset type of touch is sensed in a state that screen information having a first display direction is displayed in the first region 261, the controller 180 may display screen information having a second display direction to be overlapped with at least part of the first region.
Here, the screen information having the second display direction may have been previously set, or may be set by the user. More specifically, the screen information having the second display direction may be set in advance to correspond to each of the screen information having the first display direction. Furthermore, the user may directly set the screen information having the second display direction for each of the screen information having the first display direction.
The screen information having the second display direction may be screen information associated with the screen information having the first display direction. For example, as illustrated in FIG. 14A(a), the screen information having the first display direction may be screen information 1400a associated with the transmission and reception of messages. Here, as illustrated in FIG. 14A(b), the screen information having the second display direction may be screen information 1400b indicating an identification information list of external terminals capable of transmitting and receiving messages.
For another example, as illustrated in FIG. 14B(a), the screen information having the first display direction may be screen information 1410a associated with schedule management. Here, as illustrated in FIG. 14B(b), the screen information having the second display direction may be screen information 1410b indicating a to-do list associated with the schedule.
For still another example, as illustrated in FIG. 14C(a), the screen information having the first display direction may be any one image 1420a among a plurality of images stored in the memory unit 170 of the mobile terminal. Here, as illustrated in FIG. 14C(b), the screen information having the second display direction may be screen information 1420b indicating at least some of the plurality of images stored in the memory unit 170 of the mobile terminal.
For yet still another example, as illustrated in FIG. 14D(a), the screen information having the first display direction may be any one slide 1430a among PPT slides. Here, as illustrated in FIG. 14D(b), the screen information having the second display direction may be screen information 1430b indicating the PPT slide list.
In other words, the screen information having the second display direction, as information associated with the screen information having the first display direction, may be screen information that is more convenient for the user when displayed in the second display direction than when displayed in the first display direction.
In the above, the type of screen information having the second display direction when the screen information having the first and the second display direction are displayed at the same time has been described.
Hereinafter, a method of displaying images received from cameras having different capture directions in a mobile terminal having such cameras will be described. FIGS. 15A(a) to 15B(c) are conceptual views illustrating a method of displaying images received from cameras having different capture directions.
A mobile terminal according to the present disclosure may further include a camera unit 121. Here, the camera unit 121 may be configured to change its capture direction. Furthermore, the present disclosure may include at least two camera units 121a, 121b having different capture directions. Hereinafter, the term camera unit 121 may be used as a term including all of the at least two camera units 121a, 121b having different capture directions.
The controller 180 may display an image 1510 received from the camera unit 121a. Here, when a preset type of touch is received, the controller 180 may activate the camera unit 121b, whose capture direction is different from that of the camera unit 121a from which the image 1510 was received. Furthermore, the controller 180 may display an image 1520 received from the camera unit 121b having the different capture direction to be overlapped with at least part of the received image 1510.
For example, as illustrated in FIGS. 15A(a) through 15A(c), when a preset type of touch is applied, the controller 180 may display images received from the camera units 121a, 121b having different capture directions at the same time. Here, the preset type of touch may be a touch applied to the second region 262, 263. At this time, though the camera units 121a, 121b having different capture directions can be used, the present disclosure may also be configured to receive images captured by the rotation of the camera unit 121.
For another example, as illustrated in FIGS. 15B(a) to 15B(c), the controller 180 may display an image 1510 received through one camera unit 121a between the camera units 121a, 121b having capture directions opposite to each other. At this time, when a preset type of touch is sensed, the controller 180 may display the images received from the one camera unit 121a and the other camera unit 121b, displaying the latter in at least part 1530 of the displayed image 1510.
On the other hand, an image received from the camera unit 121b having the different capture direction may be applicable to all the foregoing embodiments of screen information having a second display direction.
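A minimal sketch of the dual-camera behavior described above, in which a preset side touch activates the second camera unit and its frames are overlaid as an inset on the first camera's image; the frame type and handler names are hypothetical stand-ins, not a real camera API.

```kotlin
// Hypothetical frame type standing in for camera output.
data class Frame(val source: String)

class DualCameraPreview {
    private var frontActive = false
    var mainFrame: Frame? = null  // image 1510 from camera unit 121a
        private set
    var insetFrame: Frame? = null // image overlaid in region 1530, from unit 121b
        private set

    fun onRearFrame(frame: Frame) { mainFrame = frame }

    // A preset touch on the lateral region activates the second camera unit.
    fun onPresetSideTouch() { frontActive = true }

    fun onFrontFrame(frame: Frame) { if (frontActive) insetFrame = frame }
}

fun main() {
    val preview = DualCameraPreview()
    preview.onRearFrame(Frame("rear"))
    preview.onPresetSideTouch()
    preview.onFrontFrame(Frame("front"))
    println("${preview.mainFrame} with inset ${preview.insetFrame}")
}
```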
In the above, a method of displaying images received from camera units having different capture directions has been described. Through this, the user can activate the camera units using only a touch manipulation. Furthermore, the user can compare images captured in various capture directions at once, thereby improving the convenience of capturing.
Hereinafter, setting a specific region of the screen information having a first display direction to screen information having a second display direction will be described. FIGS. 16A and 16B are conceptual views illustrating that a specific region of the screen information having a first display direction is set to screen information having a second display direction.
When a preset type of touch applied to the second region 262, 263 is sensed in a state that screen information having a first display direction is displayed in the first region 261, the controller 180 may change the display direction of at least part of the screen information having the first display direction.
At this time, the screen information changed in the display direction may be displayed to be overlapped with the screen information having the first display direction.
On the other hand, the screen information changed in the display direction may be screen information, among the screen information having the first display direction, determined to be more conveniently displayed in the second display direction. For example, when the screen information having the first display direction is document information, the screen information having the second display direction may be an image contained in the document information.
For example, as illustrated in FIG. 16A, a drag input may be sensed in the second region 262, 263 in a state that document information 1610 is displayed in the first region 261. At this time, the controller 180 may determine the content of the document information 1610, and detect object information to be displayed in a second display direction which is different from the first display direction. For example, the object information may be an image 1620. Then, as illustrated in FIG. 16B, the controller 180 may display the detected image 1620 in at least part of the document information 1610.
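By way of illustration, detecting which object of the document information would read better in the second display direction might use a simple shape heuristic like the following; the landscape-image rule is an assumption for the sketch, not the disclosed criterion.

```kotlin
// Hypothetical model of objects contained in the document information.
data class DocObject(val kind: String, val widthPx: Int, val heightPx: Int)

// Assumed heuristic: landscape images are candidates for display in the
// second display direction.
fun objectsToRotate(doc: List<DocObject>): List<DocObject> =
    doc.filter { it.kind == "image" && it.widthPx > it.heightPx }

fun main() {
    val doc = listOf(
        DocObject("text", 800, 2000),
        DocObject("image", 1600, 900) // landscape image: rotation candidate
    )
    println(objectsToRotate(doc)) // [DocObject(kind=image, widthPx=1600, heightPx=900)]
}
```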
Through this, the user may view screen information in which the display direction is changed in only part thereof, thereby being provided with screen information of higher readability.
According to the present disclosure, the display direction of screen information displayed on the display unit may be changed with only a simple manipulation, thereby providing images having various display directions to the user.
Furthermore, according to the present disclosure, the display direction of screen information displayed on the display unit may be changed through a manipulation on a lateral surface of the display unit, thereby providing a screen with higher readability to the user.
The foregoing present invention may be implemented as codes readable by a computer on a medium in which a program is recorded. The computer-readable media may include all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable media may include a hard disk drive (HDD), solid state disk (SSD), ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical data storage device, and the like, and also include a device implemented in the form of a carrier wave (for example, transmission via the Internet). In addition, the computer may include the controller 180 of the terminal. Accordingly, the detailed description thereof should not be construed as restrictive in all aspects but considered as illustrative. The scope of the invention should be determined by reasonable interpretation of the appended claims, and all changes that come within the equivalent scope of the invention are included in the scope of the invention.