CN108984075B - Display mode switching method, device and terminal - Google Patents


Info

Publication number
CN108984075B
Authority
CN
China
Prior art keywords
display
terminal
display mode
content
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710400431.0A
Other languages
Chinese (zh)
Other versions
CN108984075A (en)
Inventor
吕林军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN201710400431.0A
Publication of CN108984075A
Application granted
Publication of CN108984075B
Legal status: Active (current)
Anticipated expiration

Abstract


Embodiments of the present application provide a display mode switching method, device, and terminal. The terminal includes a first processor, a second processor, and a display. The first and second processors are each connected to the display, which is divided into several display areas used to display content when the terminal is in different display modes. The display modes include one or more of a flat display mode, an AR display mode, and a combined display mode. The first processor controls the display area corresponding to the current display mode to display flat content when the terminal is in the flat display mode or the combined display mode; the second processor controls the display area corresponding to the current display mode to display AR content, composed of the transmitted environment image and a pre-acquired AR image, when the terminal is in the AR display mode or the combined display mode. The embodiments of the present application can provide a convenient, real-time AR experience, and the AR implementation is highly general.


Description

Display mode switching method, device, and terminal
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a display mode switching method, apparatus, and terminal.
Background
Augmented Reality (AR) is a computer application and human-computer interaction technology developed from Virtual Reality (VR). Virtual information is applied to the real world by means of computing and visualization technology, so that a real environment and virtual objects are overlaid in the same picture or space in real time and coexist.
A popular current AR implementation installs an AR application (APP) on a terminal: the terminal shoots an object from multiple angles and saves a 3D model of it; then, while shooting the surrounding real environment, it renders the 3D model together with the environment image and outputs an AR image. Re-rendering the real environment image with the 3D model takes a relatively long time, so the output AR image lags behind the real environment by a delay the user can clearly perceive, and real-time performance is poor. Trying different AR effects usually requires installing several AR APPs, so current AR implementations have poor generality.
Summary
The embodiments of the application provide a display mode switching method, apparatus, and terminal, which can provide a convenient, real-time AR experience with a highly general AR implementation.
A first aspect of an embodiment of the present application provides a terminal including a first processor, a second processor, and a display. The first processor and the second processor are each connected to the display, and the display is divided into a plurality of display areas used to display content when the terminal is in different display modes. The display modes may include one or more of a plane display mode, an AR display mode, and a combined display mode; when the terminal is in the combined display mode, it displays plane content and AR content at the same time, the AR content consisting of a transmitted environment image and a pre-acquired AR image, where:
the first processor is configured to control the display area corresponding to the current display mode to display the plane content when the terminal is in the plane display mode or the combined display mode, and the second processor is configured to control the display area corresponding to the current display mode to display the AR content when the terminal is in the AR display mode or the combined display mode. Because the first processor handles the display of plane content and the second processor handles the display of AR content, a convenient, real-time AR experience can be provided and the AR implementation is highly general.
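As a rough, non-authoritative illustration of this division of labor (the class and method names below are invented for the sketch and are not part of the patent), the first aspect can be pictured as two controllers that each drive the display only in the modes they are responsible for:

```python
from enum import Enum, auto

class DisplayMode(Enum):
    PLANE = auto()      # plane display mode: 2D content only
    AR = auto()         # AR display mode: environment image + pre-acquired AR image
    COMBINED = auto()   # both kinds of content at the same time

class Display:
    """Placeholder for the display divided into named display areas."""
    def show(self, area, content):
        print(f"[{area}] {content}")

class FirstProcessor:
    """Hypothetical stand-in for the first processor: drives plane content."""
    def drive(self, mode, display):
        if mode in (DisplayMode.PLANE, DisplayMode.COMBINED):
            display.show("plane area", "plane content")

class SecondProcessor:
    """Hypothetical stand-in for the second processor: drives AR content."""
    def drive(self, mode, display):
        if mode in (DisplayMode.AR, DisplayMode.COMBINED):
            display.show("AR area", "transmitted environment image + pre-acquired AR image")

if __name__ == "__main__":
    display, p1, p2 = Display(), FirstProcessor(), SecondProcessor()
    for mode in DisplayMode:
        print(f"-- {mode.name} --")
        p1.drive(mode, display)
        p2.drive(mode, display)
```

Running the sketch simply prints which area each stand-in processor would drive in each of the three modes.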
Optionally, the plurality of display areas are specifically two display areas, that is, a planar display area and a multifunctional display area, where the multifunctional display area is used for displaying planar contents or AR contents, where:
the first processor is specifically configured to control the plane display area and the multifunctional display area to display the plane content when the terminal is in the plane display mode, or control the plane display area to display the plane content when the terminal is in the combined display mode.
And the second processor is specifically used for controlling the multifunctional display area to display the AR content when the terminal is in the AR display mode or the combined display mode.
Optionally, the plurality of display areas are specifically a display area, that is, a multifunctional display area, and the multifunctional display area is used for displaying plane content and/or AR content, where:
and the first processor is specifically used for controlling the multifunctional display area to display the plane content when the terminal is in the plane display mode.
And the second processor is specifically used for controlling the multifunctional display area to display the AR content when the terminal is in the AR display mode.
Optionally, the first processor is specifically further configured to control a partial area of the multifunctional display area to display the planar content when the terminal is in the combined display mode.
And the second processor is specifically further used for controlling areas except the partial area in the multifunctional display area to display the AR content when the terminal is in the combined display mode.
Optionally, the multifunctional display area is specifically composed of a transparent display screen, a micro projector, a lens, and a mirror, wherein:
the mirror is used to transmit the environment image; the transparent display screen, when turned on, displays the plane content; and the micro projector, when the transparent display screen is turned off, projects the AR image onto the mirror through the lens.
Optionally, the multifunctional display area specifically comprises a transparent display screen, a liquid crystal display screen, a lens, and a background light source, wherein:
the liquid crystal display screen is used to transmit the environment image; the transparent display screen, when turned on, displays the plane content; and the background light source, when the transparent display screen is turned off, projects the AR image onto the liquid crystal display screen through the lens.
Optionally, the plurality of display areas are specifically two display areas, namely a planar display area and an AR display area, where:
the first processor is specifically configured to control the planar display area to display the planar content when the terminal is in the planar display mode or the combined display mode.
And the second processor is specifically used for controlling the AR display area to display the AR content when the terminal is in the AR display mode or the combined display mode.
Optionally, the AR display area is composed of a first mirror, a micro projector, a lens, and a second mirror, wherein:
the first mirror and the second mirror are used to transmit the environment image, and the micro projector projects the AR image onto the second mirror through the lens.
Optionally, the AR display area is composed of an outer lens, a liquid crystal display screen, a projection lens, and a background light source, wherein:
the liquid crystal display screen and the outer lens are used to transmit the environment image, and the background light source projects the AR image onto the liquid crystal display screen through the projection lens.
Optionally, the first processor is a central processing unit, and the second processor is a graphics processor independent of the first processor.
A second aspect of the embodiments of the present application provides a display mode switching method, including:
when the terminal is in the plane display mode and receives an operation instruction, it judges whether the instruction is an instruction to start displaying AR content; if so, it switches the display mode from the plane display mode to the combined display mode or the AR display mode. The AR content consists of a transmitted environment image and a pre-acquired AR image, and in the combined display mode the terminal displays plane content and AR content at the same time. Display modes can therefore be diversified, mode switching can be completed conveniently, an AR experience can be entered quickly, and the practicability of the terminal is improved.
Optionally, switching the terminal from the plane display mode to the combined display mode or the AR display mode may be implemented as follows:
the terminal judges whether the plane content needs to remain displayed; if so, it switches from the plane display mode to the combined display mode, and if not, it switches from the plane display mode to the AR display mode. In this way, depending on the user's actual needs, the terminal can display only the AR content, or keep displaying the plane content while displaying the AR content.
Optionally, the specific implementation manner of the terminal determining whether the operation instruction is an instruction for starting displaying the AR content may be:
the terminal judges whether the operation instruction is an instruction for a user of the terminal to operate the AR image file, and if yes, the terminal determines that the operation instruction is an instruction for starting displaying the AR content.
Or if the operation instruction is not the instruction of the terminal user for operating the AR image file, the terminal judges whether the operation instruction is the instruction of calling the AR image file by the terminal application, and if so, the terminal determines that the operation instruction is the instruction for starting displaying the AR content.
Optionally, the method may further include:
if the terminal is in the AR display mode, the terminal judges whether the operation instruction is an instruction for starting displaying the plane content, and if so, the terminal is switched from the AR display mode to the combined display mode or the plane display mode.
Optionally, the specific implementation of switching the terminal from the AR display mode to the combined display mode or the planar display mode may be:
the terminal judges whether the AR content needs to be kept and displayed or not, if so, the terminal is switched from the AR display mode to the combined display mode; if not, the terminal is switched from the AR display mode to the plane display mode, so that when the plane content needs to be displayed, only the plane content can be displayed according to the actual requirement of the user, or the plane content is displayed while the AR content is kept to be displayed.
Optionally, the method may further include:
if the terminal is in the combined display mode, the terminal judges whether the operation instruction is an instruction to close the display of the plane content or of the AR content;
if the operation instruction is an instruction to close the display of the plane content, the terminal switches from the combined display mode to the AR display mode; if the operation instruction is an instruction to close the display of the AR content, the terminal switches from the combined display mode to the plane display mode. Only the plane content, or only the AR content, is thus kept on display according to the user's actual needs.
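Read as a state machine, the mode transitions described in this aspect reduce to a small decision table. The sketch below is only one interpretation of the text; the instruction names and the keep_current_content flag are invented for illustration:

```python
from enum import Enum, auto

class Mode(Enum):
    PLANE = auto()
    AR = auto()
    COMBINED = auto()

def next_mode(current, instruction, keep_current_content=False):
    """Target mode after one operation instruction (a sketch, not the claimed method).

    instruction: "open_ar", "open_plane", "close_ar", or "close_plane".
    keep_current_content: whether the content already on screen should stay visible.
    """
    if current is Mode.PLANE and instruction == "open_ar":
        return Mode.COMBINED if keep_current_content else Mode.AR
    if current is Mode.AR and instruction == "open_plane":
        return Mode.COMBINED if keep_current_content else Mode.PLANE
    if current is Mode.COMBINED and instruction == "close_plane":
        return Mode.AR
    if current is Mode.COMBINED and instruction == "close_ar":
        return Mode.PLANE
    return current  # any other instruction leaves the display mode unchanged

assert next_mode(Mode.PLANE, "open_ar", keep_current_content=True) is Mode.COMBINED
assert next_mode(Mode.AR, "open_plane") is Mode.PLANE
assert next_mode(Mode.COMBINED, "close_ar") is Mode.PLANE
```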
A third aspect of the embodiments of the present application provides a display mode switching apparatus, including:
and the receiving module is used for receiving the operation instruction.
And the judging module is used for judging whether the operation instruction is an instruction for starting displaying the AR content if the terminal is in the plane display mode, wherein the AR content comprises a transmitted environment image and a pre-acquired AR image.
And the control module is used for switching the plane display mode into the combined display mode or the AR display mode if the judgment result of the judgment module is yes, and simultaneously displaying the plane content and the AR content when the terminal is in the combined display mode, so that the diversification of the terminal display mode can be realized, the switching of the display mode can be conveniently completed, the AR experience is rapidly carried out, and the practicability of the terminal is improved.
Optionally, the control module is specifically configured to:
judge whether the plane content needs to remain displayed; if so, control the terminal to switch from the plane display mode to the combined display mode, and if not, control the terminal to switch from the plane display mode to the AR display mode.
Optionally, the determining module is specifically configured to:
and judging whether the operation instruction is an instruction for a user of the terminal to operate the AR image file, if so, determining that the operation instruction is an instruction for starting displaying the AR content.
Further, the judgment module is specifically further configured to:
if the operation instruction is not the instruction of the terminal user for operating the AR image file, judging whether the operation instruction is the instruction of calling the AR image file by the terminal application, and if so, determining that the operation instruction is the instruction for starting displaying the AR content.
Optionally, the determining module is further configured to determine whether the operation instruction is an instruction to start displaying the planar content if the terminal is in the AR display mode.
And the control module is further used for switching the AR display mode to the combined display mode or the plane display mode if the judgment result of the judgment module is positive.
Optionally, the control module is specifically configured to:
and judging whether the AR content needs to be kept displayed or not. If so, the control terminal is switched from the AR display mode to the combined display mode; if not, the control terminal is switched from the AR display mode to the plane display mode.
Optionally, the determining module is further configured to determine whether the operation instruction is an instruction to close the display plane or the AR content if the terminal is in the combined display mode.
The control module is further used for switching the combined display mode to the AR display mode if the operation instruction is an instruction for closing the display plane content; and if the operation instruction is an instruction for closing the display of the AR content, the control terminal is switched from the combined display mode to the plane display mode.
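Purely for illustration, the receiving/judging/control split of the third aspect might be wired together along the lines of the following sketch; the module, method, and instruction names are hypothetical and not taken from the patent:

```python
class ReceivingModule:
    """Receives an operation instruction (touch, key press, or an application call)."""
    def receive(self, raw_event):
        return raw_event

class JudgingModule:
    """Decides whether an instruction means 'start displaying AR content'."""
    def is_open_ar_instruction(self, instruction):
        # Covers both cases described above: the user operates an AR image file,
        # or an application of the terminal calls an AR image file.
        return instruction in ("user_opens_ar_image_file", "app_calls_ar_image_file")

class ControlModule:
    """Switches the display mode of the terminal."""
    def __init__(self):
        self.mode = "plane"
    def switch_for_ar(self, keep_plane_content):
        self.mode = "combined" if keep_plane_content else "ar"

class SwitchingApparatus:
    def __init__(self):
        self.receiver = ReceivingModule()
        self.judge = JudgingModule()
        self.control = ControlModule()
    def handle(self, event, keep_plane_content=True):
        instruction = self.receiver.receive(event)
        if self.control.mode == "plane" and self.judge.is_open_ar_instruction(instruction):
            self.control.switch_for_ar(keep_plane_content)
        return self.control.mode

apparatus = SwitchingApparatus()
print(apparatus.handle("user_opens_ar_image_file"))    # -> combined
print(apparatus.handle("some_unrelated_instruction"))  # mode stays combined
```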
A fourth aspect of embodiments of the present application provides a computer-readable storage medium having stored therein instructions, which, when executed on a computer, cause the computer to perform the method of the second aspect.
A fifth aspect of the embodiments of the present application provides a computer program product containing instructions, which when run on a computer, causes the computer to perform the method according to the second aspect.
The terminal in the embodiments of the application comprises a first processor, a second processor, and a display. The first processor and the second processor are each connected to the display, and the display is divided into a plurality of display areas, where: the plurality of display areas are respectively used to display content when the terminal is in different display modes; the display modes include one or more of a plane display mode, an AR display mode, and a combined display mode; in the combined display mode the terminal displays plane content and AR content at the same time, the AR content consisting of a transmitted environment image and a pre-acquired AR image; the first processor controls the display area corresponding to the current display mode to display the plane content when the terminal is in the plane display mode or the combined display mode; and the second processor controls the display area corresponding to the current display mode to display the AR content when the terminal is in the AR display mode or the combined display mode. A convenient, real-time AR experience can thus be provided, and the AR implementation is highly general.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments or the background art of the present application, the drawings required to be used in the embodiments or the background art of the present application will be described below.
Fig. 1 is a schematic structural diagram of a terminal according to an embodiment of the present application;
Fig. 2a is a schematic diagram illustrating a display area division of a display according to an embodiment of the present application;
Fig. 2b is a schematic structural diagram of another terminal according to an embodiment of the present application;
Fig. 2c is a schematic diagram illustrating a display area division of another display according to an embodiment of the present application;
Fig. 2d is a schematic structural diagram of another terminal according to an embodiment of the present application;
Fig. 2e is a schematic structural diagram of a multifunctional display area according to an embodiment of the present application;
Fig. 2f is a schematic diagram of an optical path of a multifunctional display area according to an embodiment of the present application;
Fig. 2g is a schematic structural diagram of another multifunctional display area according to an embodiment of the present application;
Fig. 2h is a schematic diagram of an optical path of another multifunctional display area according to an embodiment of the present application;
Fig. 2i is a schematic diagram illustrating a display area division of another display according to an embodiment of the present application;
Fig. 2j is a schematic structural diagram of another terminal according to an embodiment of the present application;
Fig. 2k is a schematic structural diagram of another multifunctional display area according to an embodiment of the present application;
Fig. 2l is a schematic structural diagram of another multifunctional display area according to an embodiment of the present application;
Fig. 3 is a schematic flowchart of a display mode switching method according to an embodiment of the present application;
Fig. 4 is a schematic structural diagram of a display mode switching apparatus according to an embodiment of the present application.
Detailed Description
The embodiments of the present application will be described below with reference to the drawings.
The terminal described in the embodiment of the present application may specifically include but is not limited to: smart phones, tablet computers, intelligent wearable devices, smart televisions, vehicle-mounted terminals and the like.
Please refer to fig. 1, which is a schematic structural diagram of a terminal according to an embodiment of the present application. Taking a mobile phone as an example, the terminal described in this embodiment includes: Radio Frequency (RF) circuitry 110, memory 120, other input devices 131, camera 132, sensor 150, display 140, audio circuitry 160, I/O subsystem 170, first processor 181, second processor 182, and power supply 190. Those skilled in the art will appreciate that the handset configuration shown in fig. 1 is not limiting; it may include more or fewer components than those shown, combine certain components, split certain components, or arrange components differently.
The following describes the components of the mobile phone in detail:
theRF circuit 110 may be used for receiving and transmitting signals during information transmission and reception or during a call, and in particular, for receiving downlink information of a base station and then processing the received downlink information to thefirst processor 181 and/or thesecond processor 182; in addition, the data for designing uplink is transmitted to the base station. In general, theRF circuit 110 includes, but is not limited to, an antenna, at least one Amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, theRF circuitry 110 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
Memory 120 may be used to store computer-executable program code, including instructions; thefirst processor 181 and/or thesecond processor 182 perform various functional applications and data processing of the cellular phone by executing software programs and modules stored in thememory 120. Wherein, the storage program area can store an operating system, application programs (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. In addition, theMemory 120 may include a Read Only Memory (ROM) and a Random Access Memory (RAM), and may further include a high speed Random Access Memory, a non-volatile Memory, such as at least one disk storage device, a flash Memory device, or other volatile solid state storage device.
Theother input device 131 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone. In particular,other input devices 131 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, a light mouse (a light mouse is a touch-sensitive surface that does not display visual output, or is an extension of a touch-sensitive surface formed by a touch screen), and the like. Theother input device 131 is connected to the otherinput device controller 171 of the I/O subsystem 170, and performs signal interaction with thefirst processor 181 and/or thesecond processor 182 under the control of the otherinput device controller 171.
At least onesensor 150, such as a fingerprint sensor, a light sensor, a motion sensor, a gravity sensor, a gyroscope, and other sensors. Specifically, the light sensor may include an ambient light sensor that adjusts the brightness of thedisplay panel 141 according to the brightness of ambient light, and a proximity sensor that turns off thedisplay panel 141 and/or the backlight when the mobile phone is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
Thecamera 132 can be used for collecting image data of people, objects and the like in the environment where the mobile phone is located, the mobile phone utilizes the camera to take pictures, record videos and the like, and thecamera 132 can be a single camera, a double camera or three or more cameras. Thecamera 132 is connected to acamera controller 172 of the I/O subsystem 170, and thecamera controller 172 sends image data collected by thecamera 132 to thefirst processor 181 and/or thesecond processor 182 for processing.
The display 140 may be used to display information input by or provided to the user as well as various menus of the handset, and may also receive user input. The display 140 may include a display panel 141 and a touch panel 142. The display panel 141 specifically includes a flat display module 1411, an AR display module 1412, and a transparent display module 1413. The flat display module 1411 is configured to display flat content, i.e., two-dimensional images such as multimedia resources stored locally on the terminal or obtained from the network, and may be implemented with a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. The AR display module 1412 is configured to display AR content composed of the environment image and the AR image; that is, the display 140 shows the pre-acquired virtual AR image and, at the same time, the content transmitted in real time from the real world (i.e., the environment image). The AR display module 1412 and the transparent display module 1413 may be used in combination to display AR content or flat content: when the transparent display module 1413 is turned off, AR content is displayed, and when it is turned on, flat content is displayed.
In some possible embodiments, the flat display module 1411 or the transparent display module 1413 can be omitted.
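A minimal sketch of that on/off rule (illustrative only; the function below is not part of the patent's driver interface):

```python
def panel_content(transparent_module_on: bool) -> str:
    """Transparent display module on -> flat content; off -> composed AR content."""
    if transparent_module_on:
        return "flat content"
    return "transmitted environment image + pre-acquired AR image"

assert panel_content(True) == "flat content"
assert panel_content(False) == "transmitted environment image + pre-acquired AR image"
```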
The touch panel 142, also referred to as a touch screen or touch-sensitive screen, may collect contact or non-contact operations performed by the user on or near it (for example, operations made with a finger, a stylus, or any other suitable object or accessory, as well as body-sensing operations), including single-point and multi-point control operations, and drive the corresponding connection device according to a preset program. Optionally, the touch panel 142 may include a touch detection device and a touch controller: the touch detection device detects the user's touch position and gesture, detects the signal resulting from the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information, converts it into information that the first processor 181 and the second processor 182 can process, sends it to the first processor 181 and/or the second processor 182, and can receive and execute commands sent by them. The touch panel 142 may be implemented as a resistive, capacitive, infrared, or surface-acoustic-wave panel, or with any technology developed in the future. Further, the touch panel 142 may cover the display panel 141; a user may operate on or near the touch panel 142 according to the content displayed on the display panel 141 (including, but not limited to, a soft keyboard, a virtual mouse, virtual keys, and icons). After the touch panel 142 detects a touch operation on or near it, it transmits the operation to the first processor 181 and/or the second processor 182 through the I/O subsystem 170 to determine the type of touch event and thus the user input, and the first processor 181 and/or the second processor 182 then provides a corresponding visual output on the display panel 141 through the I/O subsystem 170 according to the type of touch event. Although the touch panel 142 and the display panel 141 are shown as two separate components in fig. 1 to implement the input and output functions of the mobile phone, in some embodiments they may be integrated to implement those functions.
Audio circuitry 160,speaker 161, and microphone 162 may provide an audio interface between a user and a cell phone. Theaudio circuit 160 may transmit the converted signal of the received audio data to thespeaker 161, and convert the signal into a sound signal for output by thespeaker 161; on the other hand, the microphone 162 converts the collected sound signals into signals, which are received by theaudio circuit 160 and converted into audio data, which are then output to theRF circuit 110 for transmission to, for example, another cell phone, or output to thememory 120 for further processing.
The I/O subsystem 170 controls input and output of external devices, and may include otherinput device controllers 171, acamera controller 172, asensor controller 173, and adisplay controller 174. Optionally, one or more otherinput device controllers 171 receive signals from and/or send signals toother input devices 131, andother input devices 131 may include physical buttons (push buttons, rocker buttons, etc.), dials, slide switches, joysticks, click wheels, a light mouse (a light mouse is a touch-sensitive surface that does not display visual output, or is an extension of a touch-sensitive surface formed by a touch screen). It is noted that otherinput device controllers 171 may be connected to any one or more of the above-described devices. Thecamera controller 172 may provide image data captured by thecamera 132 to thefirst processor 181 and/or thesecond processor 182 for processing. Thesensor controller 173 may receive signals from one ormore sensors 150 and/or send signals to one ormore sensors 150. Thedisplay controller 174 of the I/O subsystem 170 receives signals from thedisplay 140 and/or sends signals to thedisplay 140. Upon detection of user input by thedisplay 140, thedisplay controller 174 converts the detected user input into interaction with a user interface object displayed on thedisplay 140, i.e., to implement human-machine interaction.
The first processor 181 is the control center of the mobile phone and may specifically be a Central Processing Unit (CPU). It connects the various parts of the whole phone through various interfaces and lines, and executes the phone's functions and processes data by running or executing software programs and/or modules stored in the memory 120 and calling data stored in the memory 120, thereby monitoring the phone as a whole. The first processor 181 handles the display of the flat display module 1411 in the display 140. Optionally, the first processor 181 may include one or more processing units; preferably, it may integrate an application processor, which mainly handles the operating system, user interface, and application programs, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the first processor 181.
The second processor 182 may be dedicated to handling the display of the AR display module 1412 and the transparent display module 1413 in the display 140, and may be a Graphics Processing Unit (GPU) independent of the first processor 181.
In some possible embodiments, thesecond processor 182 may be specifically a GPU built in thefirst processor 181.
In some possible implementations, thesecond processor 182 may be integrated with thefirst processor 181 as one processor.
The handset also includes a power supply 190 (e.g., a battery) for powering the various components, which may preferably be logically coupled to thefirst processor 181 and thesecond processor 182 via a power management system to manage charging, discharging, and power consumption via the power management system.
Although not shown, the mobile phone may further include a bluetooth module, a wireless fidelity Wi-Fi module, etc., which are not described herein.
The first processor 181 and the second processor 182 may share one set of peripheral devices such as the I/O subsystem, the memory, and the audio circuitry shown in fig. 1, or may each use a separate set of such peripheral devices.
In a specific implementation, the display 140 may be divided into a plurality of display areas, each corresponding to content display when the terminal is in a different display mode. The display modes may include one or more of a plane display mode, an AR display mode, and a combined display mode. When the terminal is in the plane display mode, the display area corresponding to that mode displays plane content, i.e., two-dimensional images such as multimedia resources stored locally on the terminal or obtained from the network. When the terminal is in the AR display mode, the display area corresponding to that mode displays AR content composed of the environment image and the AR image; that is, the display 140 shows the pre-acquired virtual AR image and, at the same time, the content transmitted in real time from the real world (i.e., the environment image). When the terminal is in the combined display mode, the display area corresponding to that mode displays both plane content and AR content; for example, one part of the display 140 displays plane content and another part displays AR content. The virtual AR image can be obtained from an AR image file stored locally or downloaded over the network.
Thefirst processor 181 is configured to control a display area corresponding to the current display mode to display the planar content when the terminal is in the planar display mode or the combined display mode. For example, when the terminal is in the flat display mode, thefirst processor 181 controls a display area corresponding to the flat display mode (i.e., the current display mode) to display the flat content.
And thesecond processor 182 is configured to control the display area corresponding to the current display mode to display the AR content when the terminal is in the AR display mode or the combined display mode. For example, when the terminal is in the AR display mode, thesecond processor 182 controls the display area corresponding to the AR display mode (i.e., the current display mode) to display the AR content.
In this embodiment, the dividing manner of the display area of thedisplay 140 may include:
in a first mode, the display areas divided by thedisplay 140 are a planar display area and a multifunctional display area, and the multifunctional display area is used for displaying planar contents or AR contents, as shown in fig. 2a, where:
The plane display mode corresponds to the plane display area and the multifunctional display area, the AR display mode corresponds to the multifunctional display area, and the combined display mode corresponds to the plane display area and the multifunctional display area. When the terminal is in the plane display mode or the combined display mode, both the plane display area and the multifunctional display area are enabled for display; when the terminal is in the AR display mode, the multifunctional display area is enabled for display and the plane display area is turned off.
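For this first division way, the per-mode behavior described above can be summarized as a small table; the sketch below is illustrative, and the area and content labels are invented:

```python
# Division way one: a plane display area plus a multifunctional display area.
# Each entry maps a display mode to what each area shows (None = area turned off).
AREA_CONTENT = {
    "plane":    {"plane area": "plane content", "multifunctional area": "plane content"},
    "ar":       {"plane area": None,            "multifunctional area": "AR content"},
    "combined": {"plane area": "plane content", "multifunctional area": "AR content"},
}

for mode, areas in AREA_CONTENT.items():
    print(mode, "->", areas)
```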
Thefirst processor 181 may be specifically configured to control the plane display area and the multifunctional display area to display the plane content when the terminal is in the plane display mode, or control the plane display area to display the plane content when the terminal is in the combined display mode.
Thesecond processor 182 may be specifically configured to control the multifunctional display area to display the AR content when the terminal is in the AR display mode or the combined display mode.
At this time, the schematic structural diagram of the corresponding terminal is as shown in fig. 2b, and the transparent display module and the AR display module are sequentially disposed on the back of the multifunctional display area. The back of the plane display area is sequentially provided with a plane display module and a terminal circuit.
The plane display area and the multifunctional display area can display simultaneously; for example, the multifunctional display area can display plane content or AR content while the plane display area displays plane content, and the two display areas work independently without affecting each other.
It should be noted that if the plane display area and the multifunctional display area both display plane content, the two areas can display the same plane content; that is, when the terminal is not in the AR display mode or the combined display mode (i.e., no AR content needs to be displayed), the whole display 140 can be used as the display area for plane content.
The positions of the plane display area and the multifunctional display area can be interchanged. Besides dividing the display 140 into a multifunctional display area and a plane display area in the vertical direction as shown in fig. 2a, the display 140 can also be divided into these areas in the left-right direction; the relative positions are not limited, so the multifunctional display area may be on the left with the plane display area on the right, or the multifunctional display area on the right with the plane display area on the left.
In a second mode, the display areas divided by thedisplay 140 are multi-function display areas, and the multi-function display areas are used for displaying plane content and/or AR content, as shown in fig. 2c, where:
the display mode of the terminal may include only a plane display mode and an AR display mode, both of which correspond to the multi-functional display area.
Thefirst processor 181 may be specifically configured to control the multifunctional display area to display the planar content when the terminal is in the planar display mode.
Thesecond processor 182 may be specifically configured to control the multifunctional display area to display the AR content when the terminal is in the AR display mode.
In the second mode, the display mode of the terminal may further include a combined display mode, and the combined display mode also corresponds to the multifunctional display area, where:
thefirst processor 181 may be further configured to control a partial area of the multi-function display area to display the planar content when the terminal is in the combined display mode.
Thesecond processor 182 may be further configured to control an area other than the partial area in the multi-function display area to display the AR content when the terminal is in the combined display mode.
At this time, the schematic structural diagram of the corresponding terminal is as shown in fig. 2d, and the transparent display module, the AR display module and the terminal circuit are sequentially disposed on the back of the multifunctional display area. The positions of the AR display module and the terminal circuit can be interchanged.
For the first and second division ways, the multifunctional display area may specifically be composed of a transparent display screen, a micro projector, a lens, and a mirror arranged in sequence, as shown in fig. 2e, wherein:
the transparent display screen forms the transparent display module, and the micro projector, the lens, and the mirror form the AR display module.
The mirror is used to transmit the environment image.
When the transparent display screen is turned on and the micro projector is turned off, the transparent display screen displays the plane content.
When the transparent display screen is turned off and the micro projector is turned on, the micro projector projects the AR image onto the mirror through the lens. The lens may be a convex lens used to shorten the focal length of the AR image projection, which can reduce the thickness of the display 140.
The corresponding optical path when the multifunctional display area displays AR content is shown in fig. 2f: the micro projector is turned on and projects the AR image through the lens onto the mirror, where a virtual image is formed; a human eye on the transparent-display-screen side sees the virtual image transmitted from the mirror (such as the cross-shaped virtual image in fig. 2f), while the environment image located below the mirror (such as the arrow-shaped real image in fig. 2f) passes through the mirror and is also seen by the eye on the transparent-display-screen side. The display 140 thus shows the virtual AR image and, at the same time, the content transmitted from the real world (i.e., the environment image), thereby realizing AR display.
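As a back-of-the-envelope illustration of the convex lens's role (the numbers are invented, not taken from the patent), the thin-lens relation 1/f = 1/d_o + 1/d_i shows that for a fixed projector-to-lens distance d_o, a shorter focal length f gives a shorter image distance d_i. With d_o = 10 mm, reducing f from 8 mm to 4 mm reduces d_i from 40 mm to roughly 6.7 mm, so the projection path, and hence the display stack, can be made correspondingly thinner.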
For the first and second division ways, the multifunctional display area may also be composed of a transparent display screen, a liquid crystal display screen, a lens, and a background light source arranged in sequence, as shown in fig. 2g, wherein:
the transparent display screen forms the transparent display module, and the liquid crystal display screen, the lens, and the background light source form the AR display module.
The liquid crystal display screen is used to transmit the environment image.
When the transparent display screen is turned on and the background light source is turned off, the transparent display screen displays the plane content.
When the transparent display screen is turned off and the background light source is turned on, the background light source projects the AR image onto the liquid crystal display screen through the lens.
The corresponding optical path when the multifunctional display area displays AR content is shown in fig. 2h: the background light source is turned on and projects the AR image onto the liquid crystal display screen through the lens, so that the AR image is displayed evenly on the liquid crystal display screen; a human eye on the transparent-display-screen side sees the virtual image formed on the liquid crystal display screen (such as the cross-shaped virtual image in fig. 2h), while the environment image located below the liquid crystal display screen (such as the arrow-shaped real image in fig. 2h) is transmitted through the liquid crystal display screen and also seen by the eye on the transparent-display-screen side. The display 140 thus shows the virtual AR image and, at the same time, the content transmitted from the real world (i.e., the environment image), thereby realizing AR display.
In a third mode, the display areas divided by thedisplay 140 are a planar display area and an AR display area, as shown in fig. 2i, where:
The plane display mode corresponds to the plane display area, the AR display mode corresponds to the AR display area, and the combined display mode corresponds to the plane display area and the AR display area. When the terminal is in the plane display mode, the plane display area is enabled for display and the AR display area is turned off; when the terminal is in the AR display mode, the AR display area is enabled for display and the plane display area is turned off; and when the terminal is in the combined display mode, both the plane display area and the AR display area are enabled for display.
Thefirst processor 181 may be specifically configured to control the planar display area to display the planar content when the terminal is in the planar display mode or the combined display mode.
Thesecond processor 182 may be specifically configured to control the AR display area to display the AR content when the terminal is in the AR display mode or the combined display mode.
The positions of the plane display area and the AR display area may be interchanged. Besides dividing the display 140 into an AR display area and a plane display area from top to bottom as shown in fig. 2i, the display 140 may also be divided into these areas from left to right; the relative positions are not limited, so the AR display area may be on the left with the plane display area on the right, or the AR display area on the right with the plane display area on the left.
At this time, the schematic structural diagram of the corresponding terminal is shown in fig. 2j, and an AR display module is disposed on the back of the AR display area. The back of the plane display area is sequentially provided with a plane display module and a terminal circuit.
The plane display area and the AR display area can display simultaneously, for example, when the terminal is in a combined display mode, the plane display area displays plane content, the AR display area can display AR content, and the two display areas work independently and do not affect each other.
For the third division way, the AR display area may specifically be composed of a first mirror, a micro projector, a lens, and a second mirror arranged in sequence, as shown in fig. 2k, wherein:
the first mirror and the second mirror are used to transmit the environment image.
The micro projector, when turned on, projects the AR image onto the second mirror through the lens.
The corresponding optical path when the AR display area displays AR content is analogous to fig. 2f: the micro projector is turned on and projects the AR image through the lens onto the second mirror, where a virtual image is formed; a human eye on the first-mirror side sees the virtual image transmitted from the second mirror, while the environment image located below the second mirror (such as the arrow-shaped real image in fig. 2f) is transmitted through the second mirror and also seen by the eye on the first-mirror side. The display 140 thus shows the virtual AR image and, at the same time, the content transmitted from the real world (i.e., the environment image), thereby realizing AR display.
For the third division way, the AR display area may also be composed of an outer lens, a liquid crystal display screen, a projection lens, and a background light source arranged in sequence, as shown in fig. 2l, wherein:
the liquid crystal display screen and the outer lens are used to transmit the environment image.
The background light source, when turned on, projects the AR image onto the liquid crystal display screen through the projection lens.
The corresponding optical path when the AR display area displays AR content is analogous to fig. 2h: the background light source is turned on and projects the AR image onto the liquid crystal display screen through the projection lens, so that the AR image is displayed evenly on the liquid crystal display screen; a human eye on the outer-lens side sees the virtual image formed on the liquid crystal display screen, while the environment image located below the liquid crystal display screen (such as the arrow-shaped real image in fig. 2h) is transmitted through the liquid crystal display screen and also seen by the eye on the outer-lens side. The display 140 thus shows the virtual AR image and, at the same time, the content transmitted from the real world (i.e., the environment image), thereby realizing AR display.
In the embodiments of the present application, the first processor 181 and the second processor 182 are each connected to the display 140, and the display 140 is divided into a plurality of display areas, where: the plurality of display areas are respectively used to display content when the terminal is in different display modes, and the display modes include one or more of a plane display mode, an AR display mode, and a combined display mode. The first processor 181 controls the display area corresponding to the current display mode to display plane content when the terminal is in the plane display mode or the combined display mode, and the second processor 182 controls the display area corresponding to the current display mode to display AR content composed of the environment image and the AR image when the terminal is in the AR display mode or the combined display mode. A convenient, real-time AR experience can thus be provided, the AR implementation is highly general, and the practicability of the terminal is improved.
Please refer to fig. 3, which is a flowchart illustrating a display mode switching method according to an embodiment of the present disclosure. The display mode switching method described in this embodiment is applied to the terminal shown in fig. 1, and the method includes:
301. The terminal receives an operation instruction.
The display of the terminal may be divided into a plurality of display areas, different display modes are defined for the terminal, the plurality of display areas are respectively used for displaying content when the terminal is in different display modes, and the display modes may specifically include one or more of a flat display mode, an AR display mode, and a combined display mode. When the terminal is in the flat display mode, a display area corresponding to the flat display mode is used for displaying flat content, and the flat content refers to a multimedia resource locally stored by the terminal or a two-dimensional image such as a multimedia resource on a network. When the terminal is in the AR display mode, the display area corresponding to the AR display mode is used for displaying the environment image and the AR content formed by the AR image, namely, the display not only displays the virtual AR image, but also simultaneously displays the content (namely, the environment image) transmitted from the real world. When the terminal is in the combined display mode, the display area corresponding to the combined display mode is used for displaying the planar content and the AR content, for example, a part of the display area of the display displays the planar content, and another part of the display area displays the AR content.
The operation instruction may specifically be a touch operation instruction input by a user or a non-touch gesture operation instruction, may also be an operation instruction input by a user through a physical key, and may also be an instruction triggered by the application when the user clicks a menu option or a link in an application interface of the application.
302. If the terminal is in the plane display mode, the terminal judges whether the operation instruction is an instruction to start displaying AR content; if so, step 303 is executed; if not, the process ends.
303. The terminal switches from the plane display mode to the combined display mode or the AR display mode.
In a specific implementation, when the terminal is in the plane display mode (i.e., the corresponding display area displays plane content), a user of the terminal may trigger the terminal to display AR content by operating an AR image file (for example, clicking the AR image file). If the terminal determines that the operation instruction is an instruction by the user to operate the AR image file, it can determine that the operation instruction is an instruction to start displaying AR content, and the display mode of the terminal is then switched from the plane display mode to the combined display mode or the AR display mode.
In some feasible embodiments, when the terminal is in the flat display mode, the terminal may also be triggered to display the AR content by the application of the terminal in a manner of calling the AR image file, and if the terminal determines that the operation instruction is specifically an instruction for calling the AR image file by the application of the terminal, the terminal may also determine that the operation instruction is an instruction for starting displaying the AR content, and then the display mode of the terminal is switched from the flat display mode to the combined display mode or the AR display mode.
In some possible embodiments, when the terminal determines that the operation instruction is not an instruction of the user to operate the AR image file, it may further determine whether the operation instruction is an instruction of invoking the AR image file for an application of the terminal, and if so, the terminal may also determine that the operation instruction is an instruction to start displaying AR content.
In some possible embodiments, the specific implementation manner of switching the display mode of the terminal from the planar display mode to the combined display mode or the AR display mode may be:
the terminal judges whether the display is required to keep displaying the plane content, for example, prompt information can be output to a user to prompt whether the plane content is required to be kept displayed, and if the user confirms the reserved operation aiming at the input of the prompt information, the display mode of the terminal is specifically switched from the plane display mode to the combined display mode; and if the user confirms the operation which is not reserved aiming at the input of the prompt message, the display mode of the terminal is specifically switched from the plane display mode to the AR display mode.
In some possible embodiments, when the terminal is in the AR display mode, the terminal judges whether the operation instruction is an instruction for starting to display flat content; if so, the display mode of the terminal is switched from the AR display mode to the combined display mode or the flat display mode. A specific implementation manner may be as follows: the terminal judges whether display of the AR content needs to be retained; for example, prompt information may be output to the user, prompting the user to choose whether to keep displaying the AR content. If the user inputs an operation confirming retention in response to the prompt information, the display mode of the terminal is specifically switched from the AR display mode to the combined display mode; if the user inputs an operation confirming non-retention in response to the prompt information, the display mode of the terminal is specifically switched from the AR display mode to the flat display mode.
In some possible embodiments, when the terminal is in the combined display mode, the terminal judges whether the operation instruction is an instruction for closing display of the flat content or of the AR content. If the operation instruction is an instruction for closing display of the flat content, the display mode of the terminal is specifically switched from the combined display mode to the AR display mode; if the operation instruction is an instruction for closing display of the AR content, the display mode of the terminal is specifically switched from the combined display mode to the flat display mode.
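The transitions out of the AR display mode and the combined display mode mirror the flat-mode case; the following sketch, under the same hypothetical names as above, completes the set of transitions just described:

```java
// Hypothetical transitions for the AR display mode and the combined display mode.
final class OtherModeSwitches {
    /** AR mode, instruction to start flat display: keep the AR content or drop it. */
    static DisplayMode fromArMode(boolean keepArContent) {
        return keepArContent ? DisplayMode.COMBINED : DisplayMode.FLAT;
    }

    /** Combined mode: closing one kind of content leaves only the other kind on screen.
     *  closeFlatContent == true means the instruction closes the flat content;
     *  false means it closes the AR content. */
    static DisplayMode fromCombinedMode(boolean closeFlatContent) {
        return closeFlatContent ? DisplayMode.AR : DisplayMode.FLAT;
    }
}
```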
In the embodiment of the application, the display of the terminal may be divided into a plurality of display areas, different display modes are defined for the terminal, and the display areas are respectively used for displaying content when the terminal is in the different display modes. When the terminal is in the flat display mode and an operation instruction is received, the terminal judges whether the operation instruction is an instruction for starting to display AR content; if so, the terminal is controlled to switch its display mode from the flat display mode to the combined display mode or the AR display mode. In this way, the display modes of the terminal are diversified, switching between display modes can be completed conveniently, an AR experience can be entered quickly, and the practicability of the terminal is improved.
Please refer to fig. 4, which is a schematic structural diagram of a display mode switching apparatus according to an embodiment of the present application. The apparatus described in this embodiment is applied to the terminal shown in fig. 1, and the apparatus includes:
The receiving module 401 is configured to receive an operation instruction.
The determining module 402 is configured to determine, if the terminal is in the flat display mode, whether the operation instruction is an instruction to start displaying AR content, where the AR content includes a transmitted environment image and a pre-acquired AR image.
The control module 403 is configured to control the terminal to switch from the flat display mode to the combined display mode or the AR display mode if the determination result of the determining module is yes.
When the terminal is in the combined display mode, the flat content and the AR content are displayed simultaneously.
In some possible embodiments, the control module 403 is specifically configured to:
and judging whether the plane content needs to be kept and displayed or not.
And if so, controlling the terminal to be switched from the plane display mode to the combined display mode.
If not, controlling the terminal to be switched from the plane display mode to the AR display mode.
In some possible embodiments, the determining module 402 is specifically configured to:
and judging whether the operation instruction is an instruction for the terminal user to operate the AR image file.
And if so, determining the operation instruction as an instruction for starting displaying the AR content.
In some possible embodiments, the determining module 402 is further configured to:
and if the operation instruction is not the instruction of the terminal user for operating the AR image file, judging whether the operation instruction is the instruction of calling the AR image file for the terminal application.
And if so, determining the operation instruction as an instruction for starting displaying the AR content.
In some possible embodiments, the determining module 402 is further configured to determine, if the terminal is in the AR display mode, whether the operation instruction is an instruction to start displaying the flat content.
The control module 403 is further configured to control the terminal to switch from the AR display mode to the combined display mode or the flat display mode if the determination result of the determining module 402 is yes.
In some possible embodiments, the control module 403 is specifically configured to:
and judging whether the AR content needs to be kept and displayed.
And if so, controlling the terminal to be switched from the AR display mode to the combined display mode.
If not, controlling the terminal to be switched from the AR display mode to the plane display mode.
In some possible embodiments, the determining module 402 is further configured to determine, if the terminal is in the combined display mode, whether the operation instruction is an instruction to close display of the flat content or of the AR content.
The control module 403 is further configured to control the terminal to switch from the combined display mode to the AR display mode if the operation instruction is an instruction to close display of the flat content.
The control module 403 is further configured to control the terminal to switch from the combined display mode to the flat display mode if the operation instruction is an instruction to close display of the AR content.
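Putting the three modules together, the following self-contained Java sketch shows one way the receiving, determining and control modules of fig. 4 could cooperate; every identifier is illustrative, and the patent does not mandate this structure.

```java
// Hypothetical sketch of the apparatus of fig. 4: an instruction is received, the current mode is
// judged against the instruction, and the display mode is switched accordingly.
final class DisplayModeSwitchingApparatus {
    enum Mode { FLAT, AR, COMBINED }
    enum Instruction { START_AR, START_FLAT, CLOSE_FLAT, CLOSE_AR, OTHER }

    private Mode mode = Mode.FLAT;

    /**
     * @param instruction        the received operation instruction, already classified
     * @param keepCurrentContent the user's answer to the "keep displaying the current content?" prompt
     * @return the display mode after the instruction has been handled
     */
    Mode handle(Instruction instruction, boolean keepCurrentContent) {
        switch (mode) {
            case FLAT:
                if (instruction == Instruction.START_AR) {
                    mode = keepCurrentContent ? Mode.COMBINED : Mode.AR;
                }
                break;
            case AR:
                if (instruction == Instruction.START_FLAT) {
                    mode = keepCurrentContent ? Mode.COMBINED : Mode.FLAT;
                }
                break;
            case COMBINED:
                if (instruction == Instruction.CLOSE_FLAT) {
                    mode = Mode.AR;
                } else if (instruction == Instruction.CLOSE_AR) {
                    mode = Mode.FLAT;
                }
                break;
        }
        return mode;
    }
}
```

For example, calling handle(Instruction.START_AR, true) while in flat mode yields the combined mode, and calling handle(Instruction.CLOSE_AR, false) while in the combined mode yields the flat mode, matching the behaviour of the modules described above.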
It can be understood that the functions of the functional modules of the apparatus of this embodiment may be specifically implemented according to the method in the foregoing method embodiment, and for the specific implementation process, reference may be made to the relevant description of the foregoing method embodiment; details are not described herein again.
In the embodiment of the application, the display of the terminal may be divided into a plurality of display areas, different display modes are defined for the terminal, and the plurality of display areas are respectively used for displaying content when the terminal is in the different display modes. When the terminal is in the flat display mode and the receiving module 401 receives an operation instruction, the determining module 402 judges whether the operation instruction is an instruction to start displaying AR content; if so, the control module 403 controls the display mode of the terminal to switch from the flat display mode to the combined display mode or the AR display mode. In this way, the display modes of the terminal are diversified, switching between display modes can be completed conveniently, an AR experience can be entered quickly, and the practicability of the terminal is improved.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (for example, coaxial cable, optical fiber, or Digital Subscriber Line (DSL)) or wirelessly (for example, infrared or microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or a data center that integrates one or more available media. The available medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a Solid State Disk (SSD)), among others.
In summary, the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting them; although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features thereof may be equivalently replaced, and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (24)

1. A terminal, characterized by comprising a first processor, a second processor and a display, wherein the first processor and the second processor are respectively connected with the display, and the display is divided into a plurality of display areas, wherein:
the display areas are respectively used for displaying contents when the terminal is in different display modes, the display modes comprise one or more of a plane display mode, an Augmented Reality (AR) display mode and a combined display mode, the terminal simultaneously displays the plane contents and the AR contents when in the combined display mode, and the AR contents comprise a transmitted environment image and a pre-acquired AR image;
the first processor is configured to control a display area corresponding to a current display mode to display the planar content when the terminal is in the planar display mode or the combined display mode;
and the second processor is used for controlling a display area corresponding to the current display mode to display the AR content when the terminal is in the AR display mode or the combined display mode.
2. The terminal of claim 1, wherein the plurality of display areas are a flat display area and a multi-function display area, the multi-function display area being used for displaying the flat content or the AR content, wherein:
the first processor is specifically configured to control the plane display area and the multifunctional display area to display the plane content when the terminal is in the plane display mode, or control the plane display area to display the plane content when the terminal is in the combined display mode;
the second processor is specifically configured to control the multifunctional display area to display the AR content when the terminal is in the AR display mode or the combined display mode.
3. The terminal of claim 1, wherein the plurality of display areas are multi-function display areas for displaying the planar content and/or the AR content, and wherein:
the first processor is specifically configured to control the multifunctional display area to display the planar content when the terminal is in the planar display mode;
the second processor is specifically configured to control the multifunctional display area to display the AR content when the terminal is in the AR display mode.
4. The terminal of claim 3,
the first processor is specifically configured to control a partial area of the multifunctional display area to display the planar content when the terminal is in the combined display mode;
the second processor is specifically further configured to control, when the terminal is in the combined display mode, an area other than the partial area in the multi-function display area to display the AR content.
5. The terminal according to any one of claims 2 to 4, wherein the multifunctional display area is composed of a transparent display screen, a micro projector, a lens and a mirror, wherein:
the mirror is used for transmitting the environment image;
when the transparent display screen is turned on, the transparent display screen is used for displaying the plane content;
when the transparent display screen is turned off, the micro projector is used for projecting the AR image onto the mirror through the lens.
6. The terminal according to any one of claims 2 to 4, wherein the multifunctional display area is composed of a transparent display screen, a liquid crystal display screen, a lens and a background light source, wherein:
the liquid crystal display screen is used for transmitting the environment image;
when the transparent display screen is turned on, the transparent display screen is used for displaying the plane content;
when the transparent display screen is turned off, the background light source is used for projecting the AR image onto the liquid crystal display screen through the lens.
7. The terminal of claim 1, wherein the plurality of display areas are a flat display area and an AR display area, and wherein:
the first processor is specifically configured to control the planar display area to display the planar content when the terminal is in the planar display mode or the combined display mode;
the second processor is specifically configured to control the AR display area to display the AR content when the terminal is in the AR display mode or the combined display mode.
8. The terminal of claim 7, wherein the AR display area is composed of a first mirror, a micro projector, a lens, and a second mirror, wherein:
the first mirror and the second mirror are used for transmitting the environment image;
the micro projector is used for projecting the AR image onto the second mirror through the lens.
9. The terminal of claim 7, wherein the AR display area is composed of a mirror, a liquid crystal display screen, a lens, and a background light source, wherein:
the liquid crystal display screen and the mirror are used for transmitting the environment image;
the background light source is used for projecting the AR image onto the liquid crystal display screen through the lens.
10. The terminal of claim 1,
the first processor is a central processing unit, and the second processor is a graphics processor independent of the first processor.
11. A display mode switching method is applied to the terminal of claim 1, and is characterized in that the terminal comprises a first processor, a second processor and a display, the first processor and the second processor are respectively connected with the display, the display is divided into a plurality of display areas and is respectively used for displaying contents when the terminal is in different display modes, and the display modes comprise one or more of a plane display mode, an Augmented Reality (AR) display mode and a combined display mode; the method comprises the following steps: the terminal receives an operation instruction;
if the terminal is in a plane display mode, the terminal judges whether the operation instruction is an instruction for starting displaying AR content, wherein the AR content comprises a transmitted environment image and a pre-acquired AR image, and the terminal in the plane display mode comprises the terminal controlling a display area corresponding to the plane display mode to display the plane content through the first processor;
if so, the terminal is switched from the plane display mode to a combined display mode or an AR display mode, where the combined display mode includes that the terminal controls, through the first processor, a display area corresponding to the combined display mode to display the plane content, and controls, through the second processor, a display area corresponding to the augmented reality AR display mode to display the AR content, and the augmented reality AR display mode includes that the terminal controls, through the second processor, the display area corresponding to the augmented reality AR display mode to display the AR content;
and when the terminal is in the combined display mode, the planar content and the AR content are displayed simultaneously.
12. The method according to claim 11, wherein the terminal switches from the flat display mode to the combined display mode or the AR display mode, comprising:
the terminal judges whether the plane content needs to be reserved and displayed or not;
if so, switching the plane display mode to the combined display mode by the terminal;
if not, the terminal is switched from the plane display mode to the AR display mode.
13. The method according to claim 11 or 12, wherein the determining, by the terminal, whether the operation instruction is an instruction to start displaying the AR content includes:
the terminal judges whether the operation instruction is an instruction for operating the AR image file by a user of the terminal;
and if so, the terminal determines that the operation instruction is an instruction for starting displaying the AR content.
14. The method of claim 13, further comprising:
if the operation instruction is not the instruction of the terminal user for operating the AR image file, the terminal judges whether the operation instruction is the instruction of calling the AR image file for the terminal application;
and if so, the terminal determines that the operation instruction is an instruction for starting displaying the AR content.
15. The method of claim 11, further comprising:
if the terminal is in the AR display mode, the terminal judges whether the operation instruction is an instruction for starting to display the plane content;
and if so, the terminal is switched from the AR display mode to the combined display mode or the plane display mode.
16. The method of claim 15, wherein the terminal is switched from the AR display mode to the combined display mode or the flat display mode, comprising:
the terminal judges whether the AR content needs to be reserved and displayed or not;
if so, the terminal is switched from the AR display mode to the combined display mode;
if not, the terminal is switched from the AR display mode to the plane display mode.
17. The method of claim 11, further comprising:
if the terminal is in the combined display mode, the terminal judges whether the operation instruction is an instruction for closing the display of the plane content or the AR content;
if the operation instruction is an instruction for closing the display of the plane content, the terminal is switched from the combined display mode to the AR display mode;
and if the operation instruction is an instruction for closing the display of the AR content, the terminal is switched from the combined display mode to the plane display mode.
18. A display mode switching device, applied to the terminal of claim 1, wherein the terminal includes a first processor, a second processor and a display, the first processor and the second processor are respectively connected to the display, the display is divided into a plurality of display areas, and the display areas are respectively used for displaying contents when the terminal is in different display modes, and the display modes include one or more of a flat display mode, an Augmented Reality (AR) display mode and a combined display mode;
the device comprises:
the receiving module is used for receiving an operation instruction;
the judging module is used for judging whether the operation instruction is an instruction for starting to display the AR content if the terminal is in a plane display mode, wherein the AR content comprises a transmitted environment image and a pre-acquired AR image, and the terminal in the plane display mode comprises the terminal controlling a display area corresponding to the plane display mode to display the plane content through the first processor;
a control module, configured to control the terminal to switch from the planar display mode to a combined display mode or an AR display mode if a determination result of the determination module is yes, where the combined display mode includes that the terminal controls, through the first processor, a display area corresponding to the combined display mode to display the planar content, and controls, through the second processor, a display area corresponding to the augmented reality AR display mode to display the AR content, and the augmented reality AR display mode includes that the terminal controls, through the second processor, the display area corresponding to the augmented reality AR display mode to display the AR content;
and when the terminal is in the combined display mode, the planar content and the AR content are displayed simultaneously.
19. The apparatus of claim 18, wherein the control module is specifically configured to:
judging whether the plane content needs to be reserved and displayed;
if so, controlling the terminal to be switched from the plane display mode to the combined display mode;
if not, controlling the terminal to be switched from the plane display mode to the AR display mode.
20. The apparatus according to claim 18 or 19, wherein the determining module is specifically configured to:
judging whether the operation instruction is an instruction for operating the AR image file by a user of the terminal;
and if so, determining the operation instruction as an instruction for starting displaying the AR content.
21. The apparatus according to claim 20, wherein the determining module is further configured to:
if the operation instruction is not the instruction of the terminal user for operating the AR image file, judging whether the operation instruction is the instruction of calling the AR image file for the terminal application;
and if so, determining the operation instruction as an instruction for starting displaying the AR content.
22. The apparatus of claim 18,
the judging module is further configured to judge whether the operation instruction is an instruction for starting display of the plane content if the terminal is in the AR display mode;
the control module is further configured to control the terminal to switch from the AR display mode to the combined display mode or the planar display mode if the determination result of the determination module is yes.
23. The apparatus of claim 22, wherein the control module is specifically configured to:
judging whether the AR content needs to be kept and displayed;
if so, controlling the terminal to be switched from the AR display mode to the combined display mode;
if not, controlling the terminal to be switched from the AR display mode to the plane display mode.
24. The apparatus of claim 18,
the judging module is further configured to judge whether the operation instruction is an instruction to close displaying the plane content or the AR content if the terminal is in the combined display mode;
the control module is further configured to control the terminal to switch from the combined display mode to the AR display mode if the operation instruction is an instruction to close display of the planar content;
the control module is further configured to control the terminal to switch from the combined display mode to the flat display mode if the operation instruction is an instruction to close displaying the AR content.

Priority Applications (1)

Application Number: CN201710400431.0A
Priority Date: 2017-05-31
Filing Date: 2017-05-31
Title: Display mode switching method, device and terminal

Applications Claiming Priority (1)

Application Number: CN201710400431.0A
Priority Date: 2017-05-31
Filing Date: 2017-05-31
Title: Display mode switching method, device and terminal

Publications (2)

Publication Number | Publication Date
CN108984075A (en) | 2018-12-11
CN108984075B (en) | 2021-09-07

Family

ID=64500986

Family Applications (1)

Application Number: CN201710400431.0A (Active; granted publication CN108984075B)
Priority Date: 2017-05-31
Filing Date: 2017-05-31
Title: Display mode switching method, device and terminal

Country Status (1)

Country | Link
CN (1) | CN108984075B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN110101455B (en)* | 2019-04-30 | 2021-01-01 | 微创(上海)医疗机器人有限公司 | Display device and surgical robot
CN111273975A (en)* | 2020-01-07 | 2020-06-12 | 支付宝(杭州)信息技术有限公司 | Method and device for displaying shared vehicle information
CN116991519A (en)* | 2021-03-30 | 2023-11-03 | 联想(北京)有限公司 | A display control method and electronic device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN103970409A (en)* | 2013-01-28 | 2014-08-06 | 三星电子株式会社 | Method For Generating An Augmented Reality Content And Terminal Using The Same
KR20150116032A (en)* | 2014-04-03 | 2015-10-15 | (주)세븐피엠밴드 | Method of providing augmented reality
CN105700688A (en)* | 2016-03-17 | 2016-06-22 | 京东方科技集团股份有限公司 | Virtual reality/augmented reality device
CN105739093A (en)* | 2014-12-08 | 2016-07-06 | 北京蚁视科技有限公司 | See-through type augmented reality near-eye display
CN106101689A (en)* | 2016-06-13 | 2016-11-09 | 西安电子科技大学 | Utilize the method that mobile phone monocular cam carries out augmented reality to virtual reality glasses
CN106251403A (en)* | 2016-06-12 | 2016-12-21 | 深圳超多维光电子有限公司 | A kind of methods, devices and systems of virtual three-dimensional Scene realization
CN106293067A (en)* | 2016-07-27 | 2017-01-04 | 上海与德通讯技术有限公司 | A kind of display changeover method and wearable display device
CN106526859A (en)* | 2016-12-14 | 2017-03-22 | 中国航空工业集团公司洛阳电光设备研究所 | VR and AR compatible head-wearing display equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
IL313875A (en)* | 2013-11-27 | 2024-08-01 | Magic Leap Inc | Virtual and augmented reality systems and methods


Also Published As

Publication number | Publication date
CN108984075A (en) | 2018-12-11

Similar Documents

PublicationPublication DateTitle
JP7062092B2 (en) Display control method and terminal
EP3525075B1 (en)Method for lighting up screen of double-screen terminal, and terminal
US11843715B2 (en)Photographing method and terminal
CN110456911B (en) Electronic device control method and device, electronic device, and readable storage medium
CN106921791B (en)Multimedia file storage and viewing method and device and mobile terminal
US20180150211A1 (en)Method for adjusting photographing focal length of mobile terminal by using touchpad, and mobile terminal
CN110531915B (en) Screen operation method and terminal device
CN106445340B (en)Method and device for displaying stereoscopic image by double-screen terminal
CN109947327B (en)Interface viewing method, wearable device and computer-readable storage medium
JP2018504798A (en) Gesture control method, device, and system
CN109542325B (en)Double-sided screen touch method, double-sided screen terminal and readable storage medium
CN107807772A (en)Image processing method, device and mobile terminal
US20150077437A1 (en)Method for Implementing Electronic Magnifier and User Equipment
CN109407948B (en)Interface display method and mobile terminal
CN110245601A (en) Eye tracking methods and related products
CN108958593B (en)Method for determining communication object and mobile terminal
CN110221795A (en)A kind of screen recording method and terminal
CN103399657B (en)The control method of mouse pointer, device and terminal unit
CN106371749A (en)Method and device for terminal control
CN110609648A (en) Application program control method and terminal
CN108958587A (en) Split-screen processing method, device, storage medium and electronic equipment
CN108446156A (en)A kind of application control method and terminal
CN108984075B (en) Display mode switching method, device and terminal
CN108196663B (en)Face recognition method and mobile terminal
CN110096213B (en)Terminal operation method based on gestures, mobile terminal and readable storage medium

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
